

Google Cloud Certified - Professional Cloud Security Engineer Questions and Answers

Question 1

You work for an ecommerce company that stores sensitive customer data across multiple Google Cloud regions. The development team has built a new 3-tier application to process orders and must integrate the application into the production environment. You must design the network architecture to ensure strong security boundaries and isolation for the new application, facilitate secure remote maintenance by authorized third-party vendors, and follow the principle of least privilege. What should you do?

Options:

A.

Create separate VPC networks for each tier. Use VPC peering between application tiers and other required VPCs. Provide vendors with SSH keys and root access only to the instances within the VPC for maintenance purposes.

B.

Create a single VPC network and create different subnets for each tier. Create a new Google project specifically for the third-party vendors and grant the network admin role to the vendors. Deploy a VPN appliance and rely on the vendors' configurations to secure third-party access.

C.

Create separate VPC networks for each tier. Use VPC peering between application tiers and other required VPCs. Enable Identity-Aware Proxy (IAP) for remote access to management resources, limiting access to authorized vendors.

D.

Create a single VPC network and create different subnets for each tier. Create a new Google project specifically for the third-party vendors. Grant the vendors ownership of that project and the ability to modify the Shared VPC configuration.

Question 2

Your organization is moving virtual machines (VMs) to Google Cloud. You must ensure that operating system images that are used across your projects are trusted and meet your security requirements.

What should you do?

Options:

A.

Implement an organization policy to enforce that boot disks can only be created from images that come from the trusted image project.

B.

Create a Cloud Function that is automatically triggered when a new virtual machine is created from the trusted image repository. Verify that the image is not deprecated.

C.

Implement an organization policy constraint that enables the Shielded VM service on all projects to enforce the trusted image repository usage.

D.

Automate a security scanner that verifies that no common vulnerabilities and exposures (CVEs) are present in your trusted image repository.
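For reference, the trusted-image control described in this question maps to the compute.trustedImageProjects list constraint. A rough sketch of setting it programmatically, assuming the google-cloud-org-policy Python client and illustrative organization and project names, might look like this:

```python
from google.cloud import orgpolicy_v2  # pip install google-cloud-org-policy

def allow_only_trusted_images(org_id: str, trusted_project: str) -> None:
    """Restrict boot-disk image sources to a single trusted image project (sketch)."""
    client = orgpolicy_v2.OrgPolicyClient()
    policy = orgpolicy_v2.Policy(
        # hypothetical organization ID; the constraint name itself is real
        name=f"organizations/{org_id}/policies/compute.trustedImageProjects",
        spec=orgpolicy_v2.PolicySpec(
            rules=[
                orgpolicy_v2.PolicySpec.PolicyRule(
                    values=orgpolicy_v2.PolicySpec.PolicyRule.StringValues(
                        allowed_values=[f"projects/{trusted_project}"]
                    )
                )
            ]
        ),
    )
    client.create_policy(
        request={"parent": f"organizations/{org_id}", "policy": policy}
    )
```

The same constraint can be set in the console or with gcloud; the key point is that it is enforced at image-selection time rather than detected after the fact.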

Question 3

Your company operates an application instance group that is currently deployed behind a Google Cloud load balancer in us-central1 and is configured to use the Standard Tier network. The infrastructure team wants to expand to a second Google Cloud region, us-east2. You need to set up a single external IP address to distribute new requests to the instance groups in both regions.

What should you do?

Options:

A.

Change the load balancer backend configuration to use network endpoint groups instead of instance groups.

B.

Change the load balancer frontend configuration to use the Premium Tier network, and add the new instance group.

C.

Create a new load balancer in us-east2 using the Standard Tier network, and assign a static external IP address.

D.

Create a Cloud VPN connection between the two regions, and enable Private Google Access.

Question 4

You need to connect your organization's on-premises network with an existing Google Cloud environment that includes one Shared VPC with two subnets named Production and Non-Production. You are required to:

Use a private transport link.

Configure access to Google Cloud APIs through private API endpoints originating from on-premises environments.

Ensure that Google Cloud APIs are only consumed via VPC Service Controls.

What should you do?

Options:

A.

1. Set up a Cloud VPN link between the on-premises environment and Google Cloud. 2. Configure private access using the restricted.googleapis.com domains in on-premises DNS configurations.

B.

1. Set up a Partner Interconnect link between the on-premises environment and Google Cloud. 2. Configure private access using the private.googleapis.com domains in on-premises DNS configurations.

C.

1. Set up a Direct Peering link between the on-premises environment and Google Cloud. 2. Configure private access for both VPC subnets.

D.

1. Set up a Dedicated Interconnect link between the on-premises environment and Google Cloud. 2. Configure private access using the restricted.googleapis.com domains in on-premises DNS configurations.

Question 5

You are routing all your internet-facing traffic from Google Cloud through your on-premises internet connection. You want to accomplish this goal securely and with the highest bandwidth possible.

What should you do?

Options:

A.

Create an HA VPN connection to Google Cloud. Replace the default 0.0.0.0/0 route.

B.

Create a routing VM in Compute Engine. Configure the default route with the VM as the next hop.

C.

Configure Cloud Interconnect with HA VPN. Replace the default 0.0.0.0/0 route with an on-premises destination.

D.

Configure Cloud Interconnect and route traffic through an on-premises firewall.

Question 6

You want to evaluate GCP for PCI compliance. You need to identify Google’s inherent controls.

Which document should you review to find the information?

Options:

A.

Google Cloud Platform: Customer Responsibility Matrix

B.

PCI DSS Requirements and Security Assessment Procedures

C.

PCI SSC Cloud Computing Guidelines

D.

Product documentation for Compute Engine

Question 7

You are backing up application logs to a shared Cloud Storage bucket that is accessible to both the administrator and analysts. Analysts should not have access to logs that contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible to the administrator. What should you do?

Options:

A.

Upload the logs to both the shared bucket and the bucket with PII that is only accessible to the administrator. Use the Cloud Data Loss Prevention API to create a job trigger. Configure the trigger to delete any files that contain PII from the shared bucket.

B.

On the shared bucket, configure Object Lifecycle Management to delete objects that contain PII.

C.

On the shared bucket, configure a Cloud Storage trigger that is only triggered when PII is uploaded. Use Cloud Functions to capture the trigger and delete the files that contain PII.

D.

Use Pub/Sub and Cloud Functions to trigger a Cloud Data Loss Prevention scan every time a file is uploaded to the administrator's bucket. If the scan does not detect PII, have the function move the objects into the shared Cloud Storage bucket.

Question 8

Your organization is using Google Workspace, Google Cloud, and a third-party SIEM. You need to export authentication events, such as successful and failed user logins, to the SIEM. Logs need to be ingested in real time or near real time. What should you do?

Options:

A.

Create a Cloud Logging sink to export relevant authentication logs to a Pub/Sub topic for SIEM subscription.

B.

Poll Cloud Logging for authentication events using the gcloud logging read tool. Forward the events to the SIEM.

C.

Configure Google Workspace to directly send logs to the API endpoint of the third-party SIEM.

D.

Create a Cloud Storage bucket as a sink for all logs. Configure the SIEM to periodically scan the bucket for new log files.

Question 9

You are responsible for managing identities in your company's Google Cloud organization. Employees are frequently using your organization's corporate domain name to create unmanaged Google accounts. You want to implement a practical and efficient solution to prevent employees from completing this action in the future. What should you do?

Options:

A.

Implement an automated process that scans all identities in your organization and disables any unmanaged accounts.

B.

Create a Google Cloud identity for all users in your organization. Ensure that new users are added automatically.

C.

Register a new domain for your Google Cloud resources. Move all existing identities and resources to this domain.

D.

Switch your corporate email system to another domain to avoid using the same domain for Google Cloud identities and corporate emails.

Question 10

You need to set up a Cloud Interconnect connection between your company's on-premises data center and a VPC host network. You want to make sure that on-premises applications can only access Google APIs over the Cloud Interconnect and not through the public internet. You are required to use only APIs that are supported by VPC Service Controls to mitigate the risk of exfiltration to non-supported APIs. How should you configure the network?

Options:

A.

Enable Private Google Access on the regional subnets and global dynamic routing mode.

B.

Set up a Private Service Connect endpoint IP address with the API bundle of "all-apis", which is advertised as a route over the Cloud Interconnect connection.

C.

Use private.googleapis.com to access Google APIs using a set of IP addresses only routable from within Google Cloud, which are advertised as routes over the connection.

D.

Use restricted.googleapis.com to access Google APIs using a set of IP addresses only routable from within Google Cloud, which are advertised as routes over the Cloud Interconnect connection.

Question 11

An organization is evaluating the use of Google Cloud Platform (GCP) for certain IT workloads. A well-established directory service is used to manage user identities and lifecycle management. The organization must continue to use this directory service as the “source of truth” directory for identities.

Which solution meets the organization's requirements?

Options:

A.

Google Cloud Directory Sync (GCDS)

B.

Cloud Identity

C.

Security Assertion Markup Language (SAML)

D.

Pub/Sub

Question 12

Your company is using Cloud Dataproc for its Spark and Hadoop jobs. You want to be able to create, rotate, and destroy symmetric encryption keys used for the persistent disks used by Cloud Dataproc. Keys can be stored in the cloud.

What should you do?

Options:

A.

Use the Cloud Key Management Service to manage the data encryption key (DEK).

B.

Use the Cloud Key Management Service to manage the key encryption key (KEK).

C.

Use customer-supplied encryption keys to manage the data encryption key (DEK).

D.

Use customer-supplied encryption keys to manage the key encryption key (KEK).

Question 13

You have an application where the frontend is deployed on a managed instance group in subnet A, and the data layer is stored on a MySQL Compute Engine virtual machine (VM) in subnet B on the same VPC. Subnet A and subnet B hold several other Compute Engine VMs. You only want to allow the application frontend to access the data in the application's MySQL instance on port 3306.

What should you do?

Options:

A.

Configure an ingress firewall rule that allows communication from the source IP range of subnet A to the tag "data-tag" that is applied to the MySQL Compute Engine VM on port 3306.

B.

Configure an ingress firewall rule that allows communication from the frontend's unique service account to the unique service account of the MySQL Compute Engine VM on port 3306.

C.

Configure a network tag "fe-tag" to be applied to all instances in subnet A and a network tag "data-tag" to be applied to all instances in subnet B. Then configure an egress firewall rule that allows communication from Compute Engine VMs tagged with data-tag to destination Compute Engine VMs tagged fe-tag.

D.

Configure a network tag "fe-tag" to be applied to all instances in subnet A and a network tag "data-tag" to be applied to all instances in subnet B. Then configure an ingress firewall rule that allows communication from Compute Engine VMs tagged with fe-tag to destination Compute Engine VMs tagged with data-tag.
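As background for the firewall options above, a service-account-based rule is expressed with source and target service accounts rather than tags or IP ranges. A minimal sketch using the google-cloud-compute Python client, with hypothetical VPC and service account names:

```python
from google.cloud import compute_v1  # pip install google-cloud-compute

def allow_frontend_to_mysql(project_id: str) -> None:
    """Ingress rule: only the frontend's service account may reach the MySQL VM on TCP 3306."""
    firewall = compute_v1.Firewall(
        name="allow-fe-to-mysql-3306",
        network=f"projects/{project_id}/global/networks/prod-vpc",  # hypothetical VPC
        direction="INGRESS",
        allowed=[compute_v1.Allowed(I_p_protocol="tcp", ports=["3306"])],
        source_service_accounts=[f"fe-sa@{project_id}.iam.gserviceaccount.com"],
        target_service_accounts=[f"mysql-sa@{project_id}.iam.gserviceaccount.com"],
    )
    operation = compute_v1.FirewallsClient().insert(
        project=project_id, firewall_resource=firewall
    )
    operation.result()  # wait for the rule to be created
```

Service-account-based rules follow the workload even if it is redeployed to another subnet, which is why they are generally preferred over tag- or range-based rules for this kind of isolation.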

Question 14

A customer deployed an application on Compute Engine that takes advantage of the elastic nature of cloud computing.

How can you work with Infrastructure Operations Engineers to best ensure that Windows Compute Engine VMs are up to date with all the latest OS patches?

Options:

A.

Build new base images when patches are available, and use a CI/CD pipeline to rebuild VMs, deploying incrementally.

B.

Federate a Domain Controller into Compute Engine, and roll out weekly patches via Group Policy Object.

C.

Use Deployment Manager to provision updated VMs into new serving Instance Groups (IGs).

D.

Reboot all VMs during the weekly maintenance window and allow the StartUp Script to download the latest patches from the internet.

Question 15

An organization is migrating from their current on-premises productivity software systems to G Suite. Some network security controls were in place that were mandated by a regulatory body in their region for their previous on-premises system. The organization’s risk team wants to ensure that network security controls are maintained and effective in G Suite. A security architect supporting this migration has been asked to ensure that network security controls are in place as part of the new shared responsibility model between the organization and Google Cloud.

What solution would help meet the requirements?

Options:

A.

Ensure that firewall rules are in place to meet the required controls.

B.

Set up Cloud Armor to ensure that network security controls can be managed for G Suite.

C.

Network security is built in and is Google Cloud's responsibility for SaaS products like G Suite.

D.

Set up an array of Virtual Private Cloud (VPC) networks to control network security as mandated by the relevant regulation.

Question 16

Your company has deployed an artificial intelligence model in a central project. This model has a lot of sensitive intellectual property and must be kept strictly isolated from the internet. You must expose the model endpoint only to a defined list of projects in your organization. What should you do?

Options:

A.

Within the model project, create an external Application Load Balancer that points to the model endpoint. Create a Cloud Armor policy to restrict IP addresses to Google Cloud.

B.

Within the model project, create an internal Application Load Balancer that points to the model endpoint. Expose this load balancer with Private Service Connect to a configured list of projects.

C.

Activate Private Google Access in both the model project and in each project that needs to connect to the model. Create a firewall policy to allow connectivity to Private Google Access addresses.

D.

Create a central project to host Shared VPC networks that are provided to all other projects. Centrally administer all firewall rules in this project to grant access to the model.

Question 17

Your company hosts a critical web application on Google Cloud. The application is experiencing an increasing number of sophisticated layer 7 attacks, including cross-site scripting (XSS) and SQL injection attempts. You need to protect the application from these attacks while minimizing the impact on legitimate traffic and ensuring high availability. What should you do?

Options:

A.

Enable Google Cloud Armor's pre-configured WAF rules for OWASP Top 10 vulnerabilities at the backend service.

B.

Implement a load balancer in front of the web application instances, and enable Adaptive Protection and throttling to mitigate the occurrence of these malicious requests.

C.

Configure Cloud Next Generation Firewall to block known malicious IP addresses targeting /32 addresses.

D.

Configure a Cloud Armor security policy with customized and pre-configured WAF rules for OWASP Top 10 vulnerabilities at the load balancer.

Question 18

Your company recently published a security policy to minimize the usage of service account keys. On-premises Windows-based applications are interacting with Google Cloud APIs. You need to implement Workload Identity Federation (WIF) with your identity provider on-premises.

What should you do?

Options:

A.

Set up a workload identity pool with your corporate Active Directory Federation Services (ADFS). Configure a rule to let principals in the pool impersonate the Google Cloud service account.

B.

Set up a workload identity pool with your corporate Active Directory Federation Services (ADFS). Let all principals in the pool impersonate the Google Cloud service account.

C.

Set up a workload identity pool with an OpenID Connect (OIDC) service on the same machine. Configure a rule to let principals in the pool impersonate the Google Cloud service account.

D.

Set up a workload identity pool with an OpenID Connect (OIDC) service on the same machine. Let all principals in the pool impersonate the Google Cloud service account.

Question 19

A large e-retailer is moving to Google Cloud Platform with its ecommerce website. The company wants to ensure payment information is encrypted between the customer’s browser and GCP when customers check out online.

What should they do?

Options:

A.

Configure an SSL Certificate on an L7 Load Balancer and require encryption.

B.

Configure an SSL Certificate on a Network TCP Load Balancer and require encryption.

C.

Configure the firewall to allow inbound traffic on port 443, and block all other inbound traffic.

D.

Configure the firewall to allow outbound traffic on port 443, and block all other outbound traffic.

Question 20

You are deploying regulated workloads on Google Cloud. The regulation has data residency and data access requirements. It also requires that support is provided from the same geographical location as where the data resides.

What should you do?

Options:

A.

Enable Access Transparency Logging.

B.

Deploy resources only to regions permitted by data residency requirements.

C.

Use Data Access logging and Access Transparency logging to confirm that no users are accessing data from another region.

D.

Deploy Assured Workloads.

Question 21

A customer is running an analytics workload on Google Cloud Platform (GCP) where Compute Engine instances are accessing data stored on Cloud Storage. Your team wants to make sure that this workload will not be able to access, or be accessed from, the internet.

Which two strategies should your team use to meet these requirements? (Choose two.)

Options:

A.

Configure Private Google Access on the Compute Engine subnet

B.

Avoid assigning public IP addresses to the Compute Engine cluster.

C.

Make sure that the Compute Engine cluster is running on a separate subnet.

D.

Turn off IP forwarding on the Compute Engine instances in the cluster.

E.

Configure a Cloud NAT gateway.

Question 22

Your organization is rolling out a new continuous integration and delivery (CI/CD) process to deploy infrastructure and applications in Google Cloud. Many teams will use their own instances of the CI/CD workflow, which will run on Google Kubernetes Engine (GKE). The CI/CD pipelines must be designed to securely access Google Cloud APIs.

What should you do?

Options:

A.

1. Create a dedicated service account for the CI/CD pipelines. 2. Run the deployment pipelines in a dedicated node pool in the GKE cluster. 3. Use the service account that you created as the identity for the nodes in the pool to authenticate to the Google Cloud APIs.

B.

1. Create service accounts for each deployment pipeline. 2. Generate private keys for the service accounts. 3. Securely store the private keys as Kubernetes Secrets accessible only by the pods that run the specific deployment pipeline.

C.

1. Create individual service accounts for each deployment pipeline. 2. Add an identifier for the pipeline in the service account naming convention. 3. Ensure each pipeline runs on dedicated pods. 4. Use Workload Identity to map a deployment pipeline pod to a service account.

D.

1. Create two service accounts: one for the infrastructure and one for the application deployment. 2. Use Workload Identity to let the pods that run the two pipelines authenticate with the service accounts. 3. Run the infrastructure and application pipelines in separate namespaces.

Question 23

Your Security team believes that a former employee of your company gained unauthorized access to Google Cloud resources some time in the past 2 months by using a service account key. You need to confirm the unauthorized access and determine the user activity. What should you do?

Options:

A.

Use Security Health Analytics to determine user activity.

B.

Use the Cloud Monitoring console to filter audit logs by user.

C.

Use the Cloud Data Loss Prevention API to query logs in Cloud Storage.

D.

Use the Logs Explorer to search for user activity.

Question 24

Your organization hosts a financial services application running on Compute Engine instances for a third-party company. The third-party company’s servers that will consume the application also run on Compute Engine in a separate Google Cloud organization. You need to configure a secure network connection between the Compute Engine instances. You have the following requirements:

    The network connection must be encrypted.

    The communication between servers must be over private IP addresses.

What should you do?

Options:

A.

Configure a Cloud VPN connection between your organization's VPC network and the third party's VPC network that is controlled by VPC firewall rules.

B.

Configure a VPC peering connection between your organization's VPC network and the third party's VPC network that is controlled by VPC firewall rules.

C.

Configure a VPC Service Controls perimeter around your Compute Engine instances, and provide access to the third party via an access level.

D.

Configure an Apigee proxy that exposes your Compute Engine-hosted application as an API, and is encrypted with TLS which allows access only to the third party.

Question 25

A security audit uncovered several inconsistencies in your project's Identity and Access Management (IAM) configuration. Some service accounts have overly permissive roles, and a few external collaborators have more access than necessary. You need to gain detailed visibility into changes to IAM policies, user activity, service account behavior, and access to sensitive projects. What should you do?

Options:

A.

Enable Metrics Explorer in Cloud Monitoring to follow the service account authentication events, and build alerts linked to them.

B.

Use Cloud Audit Logs. Create log export sinks to send these logs to a security information and event management (SIEM) solution for correlation with other event sources.​

C.

Configure Google Cloud Functions to be triggered by changes to IAM policies. Analyze changes by using the policy simulator, send alerts upon risky modifications, and store event details.​

D.

Deploy the OS Config Management agent to your VMs. Use OS Config Management to create patch management jobs and monitor system modifications.​

Question 26

Your organization's customers must scan and upload their contract and driver's license to a web portal backed by Cloud Storage. You must remove all personally identifiable information (PII) from files that are older than 12 months. You must also archive the anonymized files for retention purposes.

What should you do?

Options:

A.

Set a time to live (TTL) of 12 months for the files in the Cloud Storage bucket that removes PII and moves the files to the Archive storage class.

B.

Create a Cloud Data Loss Prevention (DLP) inspection job that de-identifies PII in files created more than 12 months ago and archives them to another Cloud Storage bucket. Delete the original files.

C.

Schedule a Cloud Key Management Service (KMS) rotation period of 12 months for the encryption keys of the Cloud Storage files containing PII to de-identify them. Delete the original keys.

D.

Configure the Autoclass feature of the Cloud Storage bucket to de-identify PII. Archive the files that are older than 12 months. Delete the original files.

Question 27

A customer wants to deploy a large number of 3-tier web applications on Compute Engine.

How should the customer ensure authenticated network separation between the different tiers of the application?

Options:

A.

Run each tier in its own Project, and segregate using Project labels.

B.

Run each tier with a different Service Account (SA), and use SA-based firewall rules.

C.

Run each tier in its own subnet, and use subnet-based firewall rules.

D.

Run each tier with its own VM tags, and use tag-based firewall rules.

Question 28

You are onboarding new users into Cloud Identity and discover that some users have created consumer user accounts using the corporate domain name. How should you manage these consumer user accounts with Cloud Identity?

Options:

A.

Use Google Cloud Directory Sync to convert the unmanaged user accounts.

B.

Create a new managed user account for each consumer user account.

C.

Use the transfer tool for unmanaged user accounts.

D.

Configure single sign-on using a customer's third-party provider.

Question 29

Your company is using G Suite and has developed an application meant for internal usage on Google App Engine. You need to make sure that an external user cannot gain access to the application even when an employee’s password has been compromised.

What should you do?

Options:

A.

Enforce 2-factor authentication in G Suite for all users.

B.

Configure Cloud Identity-Aware Proxy for the App Engine Application.

C.

Provision user passwords using G Suite Password Sync.

D.

Configure Cloud VPN between your private network and GCP.

Question 30

You are responsible for the operation of your company's application that runs on Google Cloud. The database for the application will be maintained by an external partner. You need to give the partner team access to the database. This access must be restricted solely to the database and cannot extend to any other resources within your company's network. Your solution should follow Google-recommended practices. What should you do?

Options:

A.

Add a public IP address to the application's database. Create database users for each of the partner's employees. Securely distribute the credentials for these users to the partner team.

B.

Create accounts for the partner team in your corporate identity provider. Synchronize these accounts with Google Cloud Identity. Grant the accounts access to the database.

C.

Ask the partner team to set up Cloud Identity accounts within their own corporate environment and identity provider. Grant the partner’s Cloud Identity accounts access to the database.

D.

Configure Workforce Identity Federation for the partner. Connect the identity pool provider to the partner's identity provider. Grant the workforce pool resources access to the database.

Question 31

When creating a secure container image, which two items should you incorporate into the build if possible? (Choose two.)

Options:

A.

Ensure that the app does not run as PID 1.

B.

Package a single app as a container.

C.

Remove any unnecessary tools not needed by the app.

D.

Use public container images as a base image for the app.

E.

Use many container image layers to hide sensitive information.

Question 32

Your company requires the security and network engineering teams to identify all network anomalies and be able to capture payloads within VPCs. Which method should you use?

Options:

A.

Define an organization policy constraint.

B.

Configure packet mirroring policies.

C.

Enable VPC Flow Logs on the subnet.

D.

Monitor and analyze Cloud Audit Logs.

Question 33

A company allows every employee to use Google Cloud Platform. Each department has a Google Group, with all department members as group members. If a department member creates a new project, all members of that department should automatically have read-only access to all new project resources. Members of any other department should not have access to the project. You need to configure this behavior.

What should you do to meet these requirements?

Options:

A.

Create a Folder per department under the Organization. For each department’s Folder, assign the Project Viewer role to the Google Group related to that department.

B.

Create a Folder per department under the Organization. For each department’s Folder, assign the Project Browser role to the Google Group related to that department.

C.

Create a Project per department under the Organization. For each department’s Project, assign the Project Viewer role to the Google Group related to that department.

D.

Create a Project per department under the Organization. For each department’s Project, assign the Project Browser role to the Google Group related to that department.

Question 34

A customer wants to make it convenient for their mobile workforce to access a CRM web interface that is hosted on Google Cloud Platform (GCP). The CRM can only be accessed by someone on the corporate network. The customer wants to make it available over the internet. Your team requires an authentication layer in front of the application that supports two-factor authentication.

Which GCP product should the customer implement to meet these requirements?

Options:

A.

Cloud Identity-Aware Proxy

B.

Cloud Armor

C.

Cloud Endpoints

D.

Cloud VPN

Question 35

You are setting up Cloud Identity for your company's Google Cloud organization. User accounts will be provisioned from Microsoft Entra ID through Directory Sync, and there will be single sign-on through Entra ID. You need to secure the super administrator accounts for the organization. Your solution must follow the principle of least privilege and implement strong authentication. What should you do?

Options:

A.

Create dedicated accounts for super administrators. Ensure that 2-step verification is enforced for the super administrator accounts in Entra ID.

B.

Create dedicated accounts for super administrators. Enforce Google 2-step verification for the super administrator accounts.

C.

Create accounts that combine the organization administrator and the super administrator privileges. Ensure that 2-step verification is enforced for the super administrator accounts in Entra ID.

D.

Create accounts that combine the organization administrators and the super administrator privileges. Enforce Google 2-step verification for the super administrator accounts.

Question 36

You must ensure that the keys used for at-rest encryption of your data are compliant with your organization's security controls. One security control mandates that keys get rotated every 90 days. You must implement an effective detection strategy to validate if keys are rotated as required. What should you do?​

Options:

A.

Analyze the crypto key versions of the keys by using data from Cloud Asset Inventory. If an active key is older than 90 days, send an alert message through your incident notification channel.​

B.

Identify keys that have not been rotated by using Security Health Analytics. If a key is not rotated after 90 days, a finding in Security Command Center is raised.​

C.

Assess the keys in the Cloud Key Management Service by implementing code in Cloud Run. If a key is not rotated after 90 days, raise a finding in Security Command Center.​

D.

Define a metric that checks for timely key updates by using Cloud Logging. If a key is not rotated after 90 days, send an alert message through your incident notification channel.​
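For context, every detection strategy in these options ultimately reduces to checking the age of the active key version. A minimal sketch of that check, assuming the google-cloud-kms Python client and hypothetical project, location, and key ring names:

```python
import datetime
from google.cloud import kms  # pip install google-cloud-kms

def find_stale_keys(project_id: str, location: str, key_ring: str, max_age_days: int = 90):
    """Return (key name, age in days) for primary key versions older than max_age_days."""
    client = kms.KeyManagementServiceClient()
    parent = client.key_ring_path(project_id, location, key_ring)
    now = datetime.datetime.now(datetime.timezone.utc)
    stale = []
    for key in client.list_crypto_keys(request={"parent": parent}):
        primary = key.primary  # only populated for symmetric ENCRYPT_DECRYPT keys
        if not primary or not primary.create_time:
            continue
        age = now - primary.create_time
        if age > datetime.timedelta(days=max_age_days):
            stale.append((key.name, age.days))
    return stale
```

In practice the finding still has to be routed into an alerting or Security Command Center workflow; this sketch only covers the age check itself.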

Question 37

A centralized security service has been implemented by your company. All applications running in Google Cloud are required to send data to this service. You need to ensure that developers have high autonomy to configure firewall rules within their projects, while preventing accidental blockage of access to the central security service. What should you do?

Options:

A.

Deploy a central Secure Web Proxy and connect it to all VPC networks. Create a Secure Web Proxy policy to allow traffic to the central security service.

B.

Implement a hierarchical firewall policy that prioritizes the central security service by allowing its connections and directing all other traffic to the subsequent firewall level.

C.

Create a central project to manage Shared VPC networks which will be accessible to all other projects. Administer all firewall rules centrally within this project.

D.

Use Terraform to automate the creation of the required firewall rule in all projects. Restrict rule change permissions solely to the Terraform service account.

Question 38

After completing a security vulnerability assessment, you learned that cloud administrators leave Google Cloud CLI sessions open for days. You need to reduce the risk of attackers who might exploit these open sessions by setting these sessions to the minimum duration.

What should you do?

Options:

A.

Set the session duration for the Google session control to one hour.

B.

Set the reauthentication frequency for the Google Cloud session control to one hour.

C.

Set the organization policy constraint constraints/iam.allowServiceAccountCredentialLifetimeExtension to one hour.

D.

Set the organization policy constraint constraints/iam.serviceAccountKeyExpiryHours to one hour and inheritFromParent to false.

Question 39

A customer has an analytics workload running on Compute Engine that should have limited internet access.

Your team created an egress firewall rule to deny (priority 1000) all traffic to the internet.

The Compute Engine instances now need to reach out to the public repository to get security updates. What should your team do?

Options:

A.

Create an egress firewall rule to allow traffic to the CIDR range of the repository with a priority greater than 1000.

B.

Create an egress firewall rule to allow traffic to the CIDR range of the repository with a priority less than 1000.

C.

Create an egress firewall rule to allow traffic to the hostname of the repository with a priority greater than 1000.

D.

Create an egress firewall rule to allow traffic to the hostname of the repository with a priority less than 1000.

Question 40

You have the following resource hierarchy. There is an organization policy at each node in the hierarchy as shown. Which load balancer types are denied in VPC A?

Options:

A.

All load balancer types are denied in accordance with the global node’s policy.

B.

INTERNAL_TCP_UDP, INTERNAL_HTTP_HTTPS is denied in accordance with the folder’s policy.

C.

EXTERNAL_TCP_PROXY, EXTERNAL_SSL_PROXY are denied in accordance with the project’s policy.

D.

EXTERNAL_TCP_PROXY, EXTERNAL_SSL_PROXY, INTERNAL_TCP_UDP, and INTERNAL_HTTP_HTTPS are denied in accordance with the folder and project’s policies.

Question 41

Your organization uses a microservices architecture based on Google Kubernetes Engine (GKE). Security reviews recommend tighter controls around deployed container images to reduce potential vulnerabilities and maintain compliance. You need to implement an automated system by using managed services to ensure that only approved container images are deployed to the GKE clusters. What should you do?

Options:

A.

Enforce Binary Authorization in your GKE clusters. Integrate container image vulnerability scanning into the CI/CD pipeline and require vulnerability scan results to be used for Binary Authorization policy decisions.​

B.

Develop custom organization policies that restrict GKE cluster deployments to container images hosted within a specific Artifact Registry project where your approved images reside.​

C.

Build a system using third-party vulnerability databases and custom scripts to identify potential Common Vulnerabilities and Exposures (CVEs) in your container images. Prevent image deployment if the CVE impact score is beyond a specified threshold.​

D.

Automatically deploy new container images upon successful CI/CD builds by using Cloud Build triggers. Set up firewall rules to limit and control access to instances to mitigate malware injection.​

Question 42

How should a customer reliably deliver Stackdriver logs from GCP to their on-premises SIEM system?

Options:

A.

Send all logs to the SIEM system via an existing protocol such as syslog.

B.

Configure every project to export all their logs to a common BigQuery DataSet, which will be queried by the SIEM system.

C.

Configure Organizational Log Sinks to export logs to a Cloud Pub/Sub Topic, which will be sent to the SIEM via Dataflow.

D.

Build a connector for the SIEM to query for all logs in real time from the GCP RESTful JSON APIs.

Question 43

Your company's storage team manages all product images within a specific Google Cloud project. To maintain control, you must isolate access to Cloud Storage for this project, allowing the storage team to manage restrictions at the project level. They must be restricted to using corporate computers. What should you do?

Options:

A.

Employ organization-level firewall rules to block all traffic to Cloud Storage. Create exceptions for specific service accounts used by the storage team within their project.

B.

Implement VPC Service Controls by establishing an organization-wide service perimeter with all projects. Configure ingress and egress rules to restrict access to Cloud Storage based on IP address ranges.

C.

Use Context-Aware Access. Create an access level that defines the required context. Apply it as an organization policy specifically at the project level, restricting access to Cloud Storage based on that context.

D.

Use Identity and Access Management (IAM) roles at the project level within the storage team's project. Grant the storage team granular permissions on the project's Cloud Storage resources.

Question 44

Your organization operates in a highly regulated industry and uses multiple Google Cloud services. You need to identify potential risks to regulatory compliance. Which situation introduces the greatest risk?

Options:

A.

Principals have broad IAM roles allowing the creation and management of Compute Engine VMs without a pre-defined hardening process.

B.

Sensitive data is stored in a Cloud Storage bucket with the uniform bucket-level access setting enabled.

C.

The security team mandates the use of customer-managed encryption keys (CMEK) for all data classified as sensitive.

D.

The audit team needs access to Cloud Audit Logs related to managed services like BigQuery.

Question 45

A company is running their webshop on Google Kubernetes Engine and wants to analyze customer transactions in BigQuery. You need to ensure that no credit card numbers are stored in BigQuery.

What should you do?

Options:

A.

Create a BigQuery view with regular expressions matching credit card numbers to query and delete affected rows.

B.

Use the Cloud Data Loss Prevention API to redact related infoTypes before data is ingested into BigQuery.

C.

Leverage Security Command Center to scan for the assets of type Credit Card Number in BigQuery.

D.

Enable Cloud Identity-Aware Proxy to filter out credit card numbers before storing the logs in BigQuery.
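As a reference point, redacting a credit card number with the Cloud Data Loss Prevention API before ingestion looks roughly like the following sketch (the project ID and input text are placeholders):

```python
from google.cloud import dlp_v2  # pip install google-cloud-dlp

def redact_credit_cards(project_id: str, text: str) -> str:
    """Replace CREDIT_CARD_NUMBER findings with the infoType name before loading data."""
    client = dlp_v2.DlpServiceClient()
    response = client.deidentify_content(
        request={
            "parent": f"projects/{project_id}",
            "inspect_config": {"info_types": [{"name": "CREDIT_CARD_NUMBER"}]},
            "deidentify_config": {
                "info_type_transformations": {
                    "transformations": [
                        {"primitive_transformation": {"replace_with_info_type_config": {}}}
                    ]
                }
            },
            "item": {"value": text},
        }
    )
    return response.item.value

# e.g. "Paid with 4111 1111 1111 1111" -> "Paid with [CREDIT_CARD_NUMBER]"
```

The same de-identification configuration can also be applied in a streaming or batch pipeline so that raw card numbers never reach BigQuery.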

Question 46

You are managing a set of Google Cloud projects that are contained in a folder named Data Warehouse. A new data analysis team has been approved to perform data analysis for all BigQuery data in the projects within the Data Warehouse folder. They should only be able to read the data and not have permissions to modify or delete the data. You want to reduce the operational overhead of provisioning access while adhering to the principle of least privilege. What should you do?

Options:

A.

Grant the BigQuery Data Viewer role at the dataset level for each BigQuery dataset within each project in the Data Warehouse folder.

B.

Grant the BigQuery Data Viewer role at the Data Warehouse folder.

C.

Grant the BigQuery Data Viewer role at the project level for each project within the Data Warehouse folder.

D.

Grant the BigQuery Metadata Viewer role at the Data Warehouse folder.
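For orientation, granting a read-only BigQuery role once at the folder level is a single IAM binding that every project underneath inherits. A sketch using the google-cloud-resource-manager client, with a hypothetical folder ID and group address:

```python
from google.cloud import resourcemanager_v3  # pip install google-cloud-resource-manager
from google.iam.v1 import policy_pb2

def grant_bq_viewer_on_folder(folder_id: str, group_email: str) -> None:
    """Bind roles/bigquery.dataViewer to a group on a folder (read-modify-write)."""
    client = resourcemanager_v3.FoldersClient()
    resource = f"folders/{folder_id}"
    policy = client.get_iam_policy(request={"resource": resource})
    policy.bindings.append(
        policy_pb2.Binding(
            role="roles/bigquery.dataViewer",
            members=[f"group:{group_email}"],
        )
    )
    client.set_iam_policy(request={"resource": resource, "policy": policy})
```

A production version should also handle concurrent policy updates (etag conflicts); this sketch omits that for brevity.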

Question 47

You manage your organization’s Security Operations Center (SOC). You currently monitor and detect network traffic anomalies in your VPCs based on network logs. However, you want to explore your environment using network payloads and headers. Which Google Cloud product should you use?

Options:

A.

Cloud IDS

B.

VPC Service Controls logs

C.

VPC Flow Logs

D.

Google Cloud Armor

E.

Packet Mirroring

Question 48

Your organization is deploying a serverless web application on Cloud Run that must be publicly accessible over HTTPS. To meet security requirements, you need to terminate TLS at the edge, apply threat mitigation, and prepare for geo-based access restrictions. What should you do?

Options:

A.

Make the Cloud Run service public by enabling allUsers access. Configure Identity-Aware Proxy (IAP) for authentication and IP-based access control. Use custom SSL certificates for HTTPS.

B.

Assign a custom domain to the Cloud Run service. Enable HTTPS. Configure IAM to allow allUsers to invoke the service. Use firewall rules and VPC Service Controls for geo-based restriction and traffic filtering.

C.

Deploy an external HTTP(S) load balancer with a serverless NEG that points to the Cloud Run service. Use a Google-managed certificate for TLS termination. Configure a Cloud Armor policy with geo-based access control.

D.

Create a Cloud DNS public zone for the Cloud Run URL. Bind a static IP to the service. Use VPC firewall rules to restrict incoming traffic based on IP ranges and threat signatures.

Question 49

A security audit uncovered several inconsistencies in your project’s Identity and Access Management (IAM) configuration. Some service accounts have overly permissive roles, and a few external collaborators have more access than necessary. You need to gain detailed visibility into changes to IAM policies, user activity, service account behavior, and access to sensitive projects. What should you do?

Options:

A.

Deploy the OS Config Management agent to your VMs. Use OS Config Management to create patch management jobs and monitor system modifications.

B.

Enable Metrics Explorer in Cloud Monitoring to follow the service account authentication events, and build alerts linked to them.

C.

Use Cloud Audit Logs. Create log export sinks to send these logs to a security information and event management (SIEM) solution for correlation with other event sources.

D.

Configure Google Cloud Functions to be triggered by changes to IAM policies. Analyze changes by using the policy simulator, send alerts upon risky modifications, and store event details.

Question 50

Your team wants to limit users with administrative privileges at the organization level.

Which two roles should your team restrict? (Choose two.)

Options:

A.

Organization Administrator

B.

Super Admin

C.

GKE Cluster Admin

D.

Compute Admin

E.

Organization Role Viewer

Question 51

An organization adopts Google Cloud Platform (GCP) for application hosting services and needs guidance on setting up password requirements for their Cloud Identity account. The organization has a password policy requirement that corporate employee passwords must have a minimum number of characters.

Which Cloud Identity password guidelines can the organization use to inform their new requirements?

Options:

A.

Set the minimum length for passwords to be 8 characters.

B.

Set the minimum length for passwords to be 10 characters.

C.

Set the minimum length for passwords to be 12 characters.

D.

Set the minimum length for passwords to be 6 characters.

Question 52

An office manager at your small startup company is responsible for matching payments to invoices and creating billing alerts. For compliance reasons, the office manager is only permitted to have the Identity and Access Management (IAM) permissions necessary for these tasks. Which two IAM roles should the office manager have? (Choose two.)

Options:

A.

Organization Administrator

B.

Project Creator

C.

Billing Account Viewer

D.

Billing Account Costs Manager

E.

Billing Account User

Question 53

You recently joined the networking team supporting your company's Google Cloud implementation. You are tasked with familiarizing yourself with the firewall rules configuration and providing recommendations based on your networking and Google Cloud experience. What product should you recommend to detect firewall rules that are overlapped by attributes from other firewall rules with higher or equal priority?

Options:

A.

Security Command Center

B.

Firewall Rules Logging

C.

VPC Flow Logs

D.

Firewall Insights

Question 54

Your organization has hired a small, temporary partner team for 18 months. The temporary team will work alongside your DevOps team to develop your organization's application that is hosted on Google Cloud. You must give the temporary partner team access to your application's resources on Google Cloud and ensure that partner employees lose access if they are removed from their employer's organization. What should you do?

Options:

A.

Implement just-in-time privileged access to Google Cloud for the temporary partner team.

B.

Create a temporary username and password for the temporary partner team members. Auto-clean the usernames and passwords after the work engagement has ended.

C.

Add the identities of the temporary partner team members to your identity provider (IdP).

D.

Create a workforce identity pool and federate the identity pool with the identity provider (IdP) of the temporary partner team.

Question 55

You have been tasked with implementing external web application protection against common web application attacks for a public application on Google Cloud. You want to validate these policy changes before they are enforced. What service should you use?

Options:

A.

Google Cloud Armor's preconfigured rules in preview mode

B.

Prepopulated VPC firewall rules in monitor mode

C.

The inherent protections of Google Front End (GFE)

D.

Cloud Load Balancing firewall rules

E.

VPC Service Controls in dry run mode

Question 56

Your organization uses Google Workspace Enterprise Edition for authentication. You are concerned about employees leaving their laptops unattended for extended periods of time after authenticating into Google Cloud. You must prevent malicious people from using an employee's unattended laptop to modify their environment.

What should you do?

Options:

A.

Create a policy that requires employees to not leave their sessions open for long durations.

B.

Review and disable unnecessary Google Cloud APIs.

C.

Require strong passwords and 2SV through a security token or Google Authenticator.

D.

Set the session length timeout for Google Cloud services to a shorter duration.

Question 57

Your organization has 3 TB of information in BigQuery and Cloud SQL. You need to develop a cost-effective, scalable, and secure strategy to anonymize the personally identifiable information (PII) that exists today. What should you do?

Options:

A.

Scan your BigQuery and Cloud SQL data using the Cloud DLP data profiling feature. Use the data profiling results to create a de-identification strategy with either Cloud Sensitive Data Protection's de-identification templates or custom configurations.

B.

Create a new BigQuery dataset and Cloud SQL instance. Copy a small subset of the data to these new locations. Use Cloud Data Loss Prevention API to scan this subset for PII. Based on the results, create a custom anonymization script and apply the script to the entire 3 TB dataset in the original locations.

C.

Export all 3TB of data from BigQuery and Cloud SQL to Cloud Storage. Use Cloud Sensitive Data Protection to anonymize the exported data. Re-import the anonymized data back into BigQuery and Cloud SQL.

D.

Inspect a representative sample of the data in BigQuery and Cloud SQL to identify PII. Based on this analysis, develop a custom script to anonymize the identified PII.

Question 58

You need to use Cloud External Key Manager to create an encryption key to encrypt specific BigQuery data at rest in Google Cloud. Which steps should you do first?

Options:

A.

1. Create or use an existing key with a unique uniform resource identifier (URI) in your Google Cloud project. 2. Grant your Google Cloud project access to a supported external key management partner system.

B.

1. Create or use an existing key with a unique uniform resource identifier (URI) in Cloud Key Management Service (Cloud KMS). 2. In Cloud KMS, grant your Google Cloud project access to use the key.

C.

1. Create or use an existing key with a unique uniform resource identifier (URI) in a supported external key management partner system. 2. In the external key management partner system, grant access for this key to use your Google Cloud project.

D.

1. Create an external key with a unique uniform resource identifier (URI) in Cloud Key Management Service (Cloud KMS). 2. In Cloud KMS, grant your Google Cloud project access to use the key.

Question 59

You are a security administrator at your company. Per Google-recommended best practices, you implemented the domain restricted sharing organization policy to allow only required domains to access your projects. An engineering team is now reporting that users at an external partner outside your organization domain cannot be granted access to the resources in a project. How should you make an exception for your partner's domain while following the stated best practices?

Options:

A.

Turn off the domain restriction sharing organization policy. Set the policy value to "Allow All."

B.

Turn off the domain restricted sharing organization policy. Provide the external partners with the required permissions using Google's Identity and Access Management (IAM) service.

C.

Turn off the domain restricted sharing organization policy. Add each partner's Google Workspace customer ID to a Google group, add the Google group as an exception under the organization policy, and then turn the policy back on.

D.

Turn off the domain restricted sharing organization policy. Set the policy value to "Custom." Add each external partner's Cloud Identity or Google Workspace customer ID as an exception under the organization policy, and then turn the policy back on.

Question 60

Your financial services company needs to process customer personally identifiable information (PII) for analytics while adhering to strict privacy regulations. You must transform this data to protect individual privacy while ensuring that the data retains its original format and consistency for analytical integrity. Your solution must avoid full irreversible deletion. What should you do?

Options:

A.

Configure Sensitive Data Protection (SDP) to de-identify PII using format-preserving encryption (FPE).

B.

Use Cloud Key Management Service (Cloud KMS) to encrypt the entire dataset with a customer-managed encryption key (CMEK).

C.

Implement a custom BigQuery user-defined function (UDF) by using JavaScript to hash all sensitive fields before they are loaded into the analytical tables.

D.

Set up VPC Service Controls around the BigQuery project. Implement row-level encryption.

Question 61

Your organization has recently migrated sensitive customer data to Cloud Storage buckets. For compliance reasons, you must ensure that all vendor data access and administrative access by Google personnel is logged. What should you do?

Options:

A.

Configure Data Access audit logs for Cloud Storage on the project hosting the Cloud Storage buckets.

B.

Enable Access Transparency for the organization.

C.

Configure Data Access audit logs for Cloud Storage at the organization level.

D.

Enable Access Transparency for the project hosting the Cloud Storage buckets.

Question 62

An organization is starting to move its infrastructure from its on-premises environment to Google Cloud Platform (GCP). The first step the organization wants to take is to migrate its current data backup and disaster recovery solutions to GCP for later analysis. The organization’s production environment will remain on-premises for an indefinite time. The organization wants a scalable and cost-efficient solution.

Which GCP solution should the organization use?

Options:

A.

BigQuery using a data pipeline job with continuous updates

B.

Cloud Storage using a scheduled task and gsutil

C.

Compute Engine Virtual Machines using Persistent Disk

D.

Cloud Datastore using regularly scheduled batch upload jobs
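For the Cloud Storage approach in option B, gsutil (or gcloud storage) invoked from a scheduled task is the usual mechanism; teams that prefer scripting the upload can use the google-cloud-storage client instead. A minimal sketch, with hypothetical bucket and file names:

```python
from pathlib import Path
from google.cloud import storage  # pip install google-cloud-storage

def upload_backup(bucket_name: str, backup_file: str) -> None:
    """Upload one backup artifact to Cloud Storage (run from cron or another scheduler)."""
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(f"backups/{Path(backup_file).name}")
    blob.upload_from_filename(backup_file)

# e.g. upload_backup("corp-dr-backups", "/var/backups/db-2024-05-01.tar.gz")
```

Lifecycle rules on the bucket can then transition older backups to colder storage classes to keep the solution cost-efficient.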

Question 63

Your organization leverages folders to represent different teams within your Google Cloud environment. To support Infrastructure as Code (IaC) practices, each team receives a dedicated service account upon onboarding. You want to ensure that teams have comprehensive permissions to manage resources within their assigned folders while adhering to the principle of least privilege. You must design the permissions for these team-based service accounts in the most effective way possible. What should you do?​

Options:

A.

Grant each service account the folder administrator role on its respective folder.​

B.

Grant each service account the project creator role at the organization level and use folder-level IAM conditions to restrict project creation to specific folders.

C.

Assign each service account the project editor role at the organization level and instruct teams to use IAM bindings at the folder level for fine-grained permissions.​

D.

Assign each service account the folder IAM administrator role on its respective folder to allow teams to create and manage additional custom roles if needed.​

Question 64

A website design company recently migrated all customer sites to App Engine. Some sites are still in progress and should only be visible to customers and company employees from any location.

Which solution will restrict access to the in-progress sites?

Options:

A.

Upload an .htaccess file containing the customer and employee user accounts to App Engine.

B.

Create an App Engine firewall rule that allows access from the customer and employee networks and denies all other traffic.

C.

Enable Cloud Identity-Aware Proxy (IAP), and allow access to a Google Group that contains the customer and employee user accounts.

D.

Use Cloud VPN to create a VPN connection between the relevant on-premises networks and the company’s GCP Virtual Private Cloud (VPC) network.

Question 65

Your company’s chief information security officer (CISO) is requiring business data to be stored in specific locations due to regulatory requirements that affect the company’s global expansion plans. After working on a plan to implement this requirement, you determine the following:

    The services in scope are included in the Google Cloud data residency requirements.

    The business data remains within specific locations under the same organization.

    The folder structure can contain multiple data residency locations.

    The projects are aligned to specific locations.

You plan to use the Resource Location Restriction organization policy constraint with very granular control. At which level in the hierarchy should you set the constraint?

Options:

A.

Organization

B.

Resource

C.

Project

D.

Folder

Question 66

You are part of a security team investigating a compromised service account key. You need to audit which new resources were created by the service account.

What should you do?

Options:

A.

Query Data Access logs.

B.

Query Admin Activity logs.

C.

Query Access Transparency logs.

D.

Query Stackdriver Monitoring Workspace.
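For reference, Admin Activity audit logs record resource-creating API calls and can be filtered by the caller's identity. A minimal sketch using the google-cloud-logging client (the service account email is a placeholder):

```python
from google.cloud import logging  # pip install google-cloud-logging

def list_activity_by_service_account(project_id: str, sa_email: str):
    """Return Admin Activity audit log entries produced by a given service account."""
    client = logging.Client(project=project_id)
    log_filter = (
        f'logName="projects/{project_id}/logs/cloudaudit.googleapis.com%2Factivity" '
        f'AND protoPayload.authenticationInfo.principalEmail="{sa_email}"'
    )
    return list(client.list_entries(filter_=log_filter, order_by=logging.DESCENDING))
```

The same filter works directly in the Logs Explorer, which is usually the faster path during an investigation.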

Question 67

You are tasked with exporting and auditing security logs for login activity events for Google Cloud console and API calls that modify configurations to Google Cloud resources. Your export must meet the following requirements:

Export related logs for all projects in the Google Cloud organization.

Export logs in near real-time to an external SIEM.

What should you do? (Choose two.)

Options:

A.

Create a Log Sink at the organization level with a Pub/Sub destination.

B.

Create a Log Sink at the organization level with the includeChildren parameter, and set the destination to a Pub/Sub topic.

C.

Enable Data Access audit logs at the organization level to apply to all projects.

D.

Enable Google Workspace audit logs to be shared with Google Cloud in the Admin Console.

E.

Ensure that the SIEM processes the AuthenticationInfo field in the audit log entry to gather identity information.
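As a sketch of what an organization-level export looks like in practice, the following uses the GAPIC layer of the google-cloud-logging library to create a sink with include_children set, targeting a Pub/Sub topic; all IDs and names are illustrative:

```python
from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
from google.cloud.logging_v2.types import LogSink  # pip install google-cloud-logging

def create_org_audit_sink(org_id: str, project_id: str, topic: str) -> None:
    """Create an organization-level sink that forwards audit logs to Pub/Sub."""
    client = ConfigServiceV2Client()
    sink = LogSink(
        name="org-audit-to-siem",
        destination=f"pubsub.googleapis.com/projects/{project_id}/topics/{topic}",
        filter='logName:"cloudaudit.googleapis.com"',
        include_children=True,  # export logs from all child folders and projects
    )
    client.create_sink(request={"parent": f"organizations/{org_id}", "sink": sink})
```

After the sink is created, its writer identity must be granted the Pub/Sub Publisher role on the destination topic before entries start flowing to the SIEM subscription.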

Question 68

You need to follow Google-recommended practices to leverage envelope encryption and encrypt data at the application layer.

What should you do?

Options:

A.

Generate a data encryption key (DEK) locally to encrypt the data, and generate a new key encryption key (KEK) in Cloud KMS to encrypt the DEK. Store both the encrypted data and the encrypted DEK.

B.

Generate a data encryption key (DEK) locally to encrypt the data, and generate a new key encryption key (KEK) in Cloud KMS to encrypt the DEK. Store both the encrypted data and the KEK.

C.

Generate a new data encryption key (DEK) in Cloud KMS to encrypt the data, and generate a key encryption key (KEK) locally to encrypt the key. Store both the encrypted data and the encrypted DEK.

D.

Generate a new data encryption key (DEK) in Cloud KMS to encrypt the data, and generate a key encryption key (KEK) locally to encrypt the key. Store both the encrypted data and the KEK.
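
For reference, the following sketch shows the envelope-encryption flow described in option A: a DEK generated locally encrypts the data, a KEK held in Cloud KMS wraps the DEK, and only the encrypted data and the wrapped DEK are stored. The project, location, key ring, and key names are hypothetical, and the snippet assumes the google-cloud-kms and cryptography packages.

```python
# Sketch of envelope encryption with a local DEK and a Cloud KMS KEK.
# Requires: pip install google-cloud-kms cryptography
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from google.cloud import kms

# 1. Generate a data encryption key (DEK) locally and encrypt the data with it.
dek = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(dek).encrypt(nonce, b"sensitive payload", None)

# 2. Encrypt (wrap) the DEK with a key encryption key (KEK) held in Cloud KMS.
kms_client = kms.KeyManagementServiceClient()
kek_name = kms_client.crypto_key_path(
    "my-project", "us-central1", "my-keyring", "my-kek"  # hypothetical names
)
wrapped_dek = kms_client.encrypt(request={"name": kek_name, "plaintext": dek}).ciphertext

# 3. Store the encrypted data, nonce, and wrapped DEK; discard the plaintext
#    DEK. The KEK itself never leaves Cloud KMS.
record = {"ciphertext": ciphertext, "nonce": nonce, "wrapped_dek": wrapped_dek}
print({k: len(v) for k, v in record.items()})
```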

Question 69

Your team uses a service account to authenticate data transfers from a given Compute Engine virtual machine instance to a specified Cloud Storage bucket. An engineer accidentally deletes the service account, which breaks application functionality. You want to recover the application as quickly as possible without compromising security.

What should you do?

Options:

A.

Temporarily disable authentication on the Cloud Storage bucket.

B.

Use the undelete command to recover the deleted service account.

C.

Create a new service account with the same name as the deleted service account.

D.

Update the permissions of another existing service account and supply those credentials to the applications.
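
For context on option B, the IAM API exposes an undelete method for recently deleted service accounts, addressed by the account's numeric unique ID (recoverable from the Admin Activity log entry for the deletion). A minimal sketch using the Google API discovery client follows; the unique ID is a placeholder, and undeletion is only possible within the retention window (roughly 30 days).

```python
# Sketch: undelete a recently deleted service account by its numeric unique ID.
# Requires: pip install google-api-python-client google-auth
import google.auth
from googleapiclient.discovery import build

UNIQUE_ID = "123456789012345678901"  # hypothetical numeric unique ID

credentials, _ = google.auth.default()
iam = build("iam", "v1", credentials=credentials)

response = (
    iam.projects()
    .serviceAccounts()
    .undelete(name=f"projects/-/serviceAccounts/{UNIQUE_ID}", body={})
    .execute()
)
print(response.get("restoredAccount", {}).get("email"))
```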

Question 70

Your organization has Google Cloud applications that require access to external web services. You must monitor, control, and log access to these services. What should you do?

Options:

A.

Configure VPC firewall rules to allow the services to access the IP addresses of required external web services.

B.

Set up a Secure Web Proxy that allows access to the specific external web services. Configure applications to use the proxy for the web service requests.

C.

Configure Google Cloud Armor to monitor and protect your applications by checking incoming traffic for attack patterns.

D.

Set up a Cloud NAT instance to allow egress traffic from your VPC.

Question 71

Your organization deploys a large number of containerized applications on Google Kubernetes Engine (GKE). Node updates are currently applied manually. Audit findings show that a critical patch has not been installed due to a missed notification. You need to design a more reliable, cloud-first, and scalable process for node updates. What should you do?

Options:

A.

Migrate the cluster infrastructure to a self-managed Kubernetes environment for greater control over the patching process.

B.

Develop a custom script to continuously check for patch availability, download patches, and apply the patches across all components of the cluster.

C.

Schedule a daily reboot for all nodes to automatically upgrade.

D.

Configure node auto-upgrades for node pools in the maintenance windows.
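
For context on option D, GKE exposes node-pool management settings that turn on automatic node upgrades (and repairs); maintenance windows are configured separately in the cluster's maintenance policy. A minimal sketch with the google-cloud-container client follows; the project, location, cluster, and node pool names are placeholders.

```python
# Sketch: enable auto-upgrade and auto-repair on an existing GKE node pool.
# Requires: pip install google-cloud-container
from google.cloud import container_v1

client = container_v1.ClusterManagerClient()

node_pool_name = (
    "projects/my-project/locations/us-central1/clusters/prod-cluster/"
    "nodePools/default-pool"  # hypothetical resource name
)

operation = client.set_node_pool_management(
    request={
        "name": node_pool_name,
        "management": {"auto_upgrade": True, "auto_repair": True},
    }
)
print(operation.status)
```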

Question 72

Your organization uses Vertex AI, and you need to ensure that users can only access approved models from Model Garden. What should you do?

Options:

A.

Configure IAM permissions on individual Model Garden models to restrict access to specific models.

B.

Regularly audit user activity logs in Vertex AI to identify and revoke access to unapproved models.

C.

Train custom models within your Vertex AI project and restrict user access to these models.

D.

Implement an organization policy that enforces the vertexai.allowedModels constraint.

Question 73

You are setting up a CI/CD pipeline to deploy containerized applications to your production clusters on Google Kubernetes Engine (GKE). You need to prevent containers with known vulnerabilities from being deployed. You have the following requirements for your solution:

Must be cloud-native

Must be cost-efficient

Minimize operational overhead

How should you accomplish this? (Choose two.)

Options:

A.

Create a Cloud Build pipeline that will monitor changes to your container templates in a Cloud Source Repositories repository. Add a step to analyze Container Analysis results before allowing the build to continue.

B.

Use a Cloud Function triggered by log events in Google Cloud's operations suite to automatically scan your container images in Container Registry.

C.

Use a cron job on a Compute Engine instance to scan your existing repositories for known vulnerabilities and raise an alert if a non-compliant container image is found.

D.

Deploy Jenkins on GKE and configure a CI/CD pipeline to deploy your containers to Container Registry. Add a step to validate your container images before deploying your container to the cluster.

E.

In your CI/CD pipeline, add an attestation on your container image when no vulnerabilities have been found. Use a Binary Authorization policy to block deployments of containers with no attestation in your cluster.

Question 74

You are implementing data protection by design and in accordance with GDPR requirements. As part of design reviews, you are told that you need to manage the encryption key for a solution that includes workloads for Compute Engine, Google Kubernetes Engine, Cloud Storage, BigQuery, and Pub/Sub. Which option should you choose for this implementation?

Options:

A.

Cloud External Key Manager

B.

Customer-managed encryption keys

C.

Customer-supplied encryption keys

D.

Google default encryption

Question 75

You are exporting application logs to Cloud Storage. You encounter an error message that the log sinks don't support uniform bucket-level access policies. How should you resolve this error?

Options:

A.

Change the access control model for the bucket.

B.

Update your sink with the correct bucket destination.

C.

Add the roles/logging.logWriter Identity and Access Management (IAM) role to the bucket for the log sink identity.

D.

Add the roles/logging.bucketWriter Identity and Access Management (IAM) role to the bucket for the log sink identity.

Question 76

You want to update your existing VPC Service Controls perimeter with a new access level. You need to avoid breaking the existing perimeter with this change, and ensure the least disruptions to users while minimizing overhead. What should you do?

Options:

A.

Create an exact replica of your existing perimeter. Add your new access level to the replica. Update the original perimeter after the access level has been vetted.

B.

Update your perimeter with a new access level that never matches. Update the new access level to match your desired state one condition at a time to avoid being overly permissive.

C.

Enable the dry run mode on your perimeter. Add your new access level to the perimeter configuration. Update the perimeter configuration after the access level has been vetted.

D.

Enable the dry run mode on your perimeter. Add your new access level to the perimeter dry run configuration. Update the perimeter configuration after the access level has been vetted.

Question 77

Your organization wants to be continuously evaluated against the CIS Google Cloud Computing Foundations Benchmark v1.3.0 (CIS Google Cloud Foundation 1.3). Some of the controls are irrelevant to your organization and must be disregarded in evaluation. You need to create an automated system or process to ensure that only the relevant controls are evaluated.

What should you do?

Options:

A.

Mark all security findings that are irrelevant with a tag and a value that indicates a security exception. Select all marked findings and mute them on the console every time they appear. Activate Security Command Center (SCC) Premium.

B.

Activate Security Command Center (SCC) Premium. Create a rule to mute the security findings in SCC so they are not evaluated.

C.

Download all findings from Security Command Center (SCC) to a CSV file. Mark the findings that are part of CIS Google Cloud Foundation 1.3 in the file. Ignore the entries that are irrelevant and out of scope for the company.

D.

Ask an external audit company to provide independent reports including the needed CIS benchmarks. In the scope of the audit, clarify that some of the controls are not needed and must be disregarded.

Question 78

Your company's users access data in a BigQuery table. You want to ensure they can only access the data during working hours.

What should you do?

Options:

A.

Assign a BigQuery Data Viewer role along with an IAM condition that limits access to the specified working hours.

B.

Configure Cloud Scheduler so that it triggers a Cloud Functions instance that modifies the organizational policy constraints for BigQuery during the specified working hours.

C.

Assign a BigQuery Data Viewer role to a service account that adds and removes the users daily during the specified working hours.

D.

Run a gsutil script that assigns a BigQuery Data Viewer role, and remove it only during the specified working hours.
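
To illustrate the conditional binding mentioned in option A, the snippet below shows what a time-bound IAM binding could look like: a CEL condition on request.time limits when the role is usable. The group, time zone, and hours are placeholder assumptions; the binding would be merged into the resource's IAM policy with setIamPolicy at policy version 3.

```python
# Sketch of a conditional IAM binding that grants BigQuery Data Viewer only
# during working hours. Group, time zone, and hours are placeholders.
binding = {
    "role": "roles/bigquery.dataViewer",
    "members": ["group:analysts@example.com"],  # hypothetical group
    "condition": {
        "title": "working-hours-only",
        "description": "Allow access 09:00-17:00, Monday to Friday",
        "expression": (
            # Days of the week range from 0 (Sunday) to 6 (Saturday).
            'request.time.getHours("Europe/Berlin") >= 9 && '
            'request.time.getHours("Europe/Berlin") < 17 && '
            'request.time.getDayOfWeek("Europe/Berlin") >= 1 && '
            'request.time.getDayOfWeek("Europe/Berlin") <= 5'
        ),
    },
}
print(binding["condition"]["expression"])
```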

Question 79

You have noticed an increased number of phishing attacks across your enterprise user accounts. You want to implement the Google 2-Step Verification (2SV) option that uses a cryptographic signature to authenticate a user and verify the URL of the login page. Which Google 2SV option should you use?

Options:

A.

Titan Security Keys

B.

Google prompt

C.

Google Authenticator app

D.

Cloud HSM keys

Question 80

Your organization recently activated the Security Command Center (SCC) standard tier. There are a few Cloud Storage buckets that were accidentally made accessible to the public. You need to investigate the impact of the incident and remediate it.

What should you do?

Options:

A.

1. Remove the Identity and Access Management (IAM) bindings that grant access to allUsers from the buckets.
2. Apply the organization policy constraint storage.uniformBucketLevelAccess to prevent regressions.
3. Query the data access logs to report on unauthorized access.

B.

1. Change bucket permissions to limit access.
2. Query the data access audit logs for any unauthorized access to the buckets.
3. After the misconfiguration is corrected, mute the finding in the Security Command Center.

C.

1. Change permissions to limit access for authorized users.
2. Enforce a VPC Service Controls perimeter around all the production projects to immediately stop any unauthorized access.
3. Review the administrator activity audit logs to report on any unauthorized access.

D.

1. Change the bucket permissions to limit access.
2. Query the buckets' usage logs to report on unauthorized access to the data.
3. Enforce the organization policy constraint storage.publicAccessPrevention to avoid regressions.
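
As a reference for the bucket-level remediation steps that several options describe, the sketch below removes public members from a bucket's IAM policy and enforces public access prevention on the bucket itself (the storage.publicAccessPrevention organization policy is configured separately at the organization or folder level). The bucket name is a placeholder; the snippet assumes the google-cloud-storage client.

```python
# Sketch: strip public access from a bucket and enforce public access
# prevention. Bucket name is a placeholder.
# Requires: pip install google-cloud-storage
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("accidentally-public-bucket")  # hypothetical

# 1. Drop public members from every IAM binding on the bucket.
policy = bucket.get_iam_policy(requested_policy_version=3)
for binding in policy.bindings:
    binding["members"] = {
        m for m in binding["members"]
        if m not in ("allUsers", "allAuthenticatedUsers")
    }
bucket.set_iam_policy(policy)

# 2. Enforce public access prevention on the bucket to avoid regressions.
bucket.iam_configuration.public_access_prevention = "enforced"
bucket.patch()
```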

Question 81

An application running on a Compute Engine instance needs to read data from a Cloud Storage bucket. Your team does not allow Cloud Storage buckets to be globally readable and wants to follow the principle of least privilege.

Which option meets the requirement of your team?

Options:

A.

Create a Cloud Storage ACL that allows read-only access from the Compute Engine instance’s IP address and allows the application to read from the bucket without credentials.

B.

Use a service account with read-only access to the Cloud Storage bucket, and store the credentials to the service account in the config of the application on the Compute Engine instance.

C.

Use a service account with read-only access to the Cloud Storage bucket to retrieve the credentials from the instance metadata.

D.

Encrypt the data in the Cloud Storage bucket using Cloud KMS, and allow the application to decrypt the data with the KMS key.
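
For context on option C, a client library running on Compute Engine can pick up the attached service account's credentials from the metadata server through Application Default Credentials, so no key file needs to be stored in the application's configuration. A minimal sketch follows; the bucket and object names are hypothetical.

```python
# Sketch: read an object using the VM's attached service account via
# Application Default Credentials (no key file in code or config).
# Requires: pip install google-cloud-storage
from google.cloud import storage

# On Compute Engine, storage.Client() obtains credentials from the metadata
# server for the instance's attached (read-only) service account.
client = storage.Client()
data = client.bucket("orders-data").blob("exports/latest.csv").download_as_bytes()
print(len(data), "bytes read")
```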

Question 82

You have just created a new log bucket to replace the _Default log bucket. You want to route all log entries that are currently routed to the _Default log bucket to this new log bucket in the most efficient manner. What should you do?

Options:

A.

Create a user-defined sink with inclusion filters copied from the _Default sink. Select the new log bucket as the sink destination.

B.

Create exclusion filters for the _Default sink to prevent it from receiving new logs. Create a user-defined sink, and select the new log bucket as the sink destination.

C.

Disable the _Default sink. Create a user-defined sink and select the new log bucket as the sink destination.

D.

Edit the _Default sink, and select the new log bucket as the sink destination.
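
For context on option D, a sink's destination can be changed in place; the sketch below repoints the _Default sink at a new log bucket using the google-cloud-logging client. The project and log bucket IDs are placeholders.

```python
# Sketch: update the _Default sink so it routes to a new log bucket.
# Requires: pip install google-cloud-logging
from google.cloud import logging

client = logging.Client(project="my-project")  # hypothetical project

sink = client.sink("_Default")
sink.reload()  # fetch the current filter and destination
sink.destination = (
    "logging.googleapis.com/projects/my-project/locations/global/"
    "buckets/new-default-bucket"  # hypothetical log bucket
)
sink.update()
```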

Question 83

You’re developing the incident response plan for your company. You need to define the access strategy that your DevOps team will use when reviewing and investigating a deployment issue in your Google Cloud environment. There are two main requirements:

    Least-privilege access must be enforced at all times.

    The DevOps team must be able to access the required resources only during the deployment issue.

How should you grant access while following Google-recommended best practices?

Options:

A.

Assign the Project Viewer Identity and Access Management (IAM) role to the DevOps team.

B.

Create a custom IAM role with limited list/view permissions, and assign it to the DevOps team.

C.

Create a service account, and grant it the Project Owner IAM role. Give the Service Account User role on this service account to the DevOps team.

D.

Create a service account, and grant it limited list/view permissions. Give the Service Account User role on this service account to the DevOps team.

Question 84

You have been tasked with configuring Security Command Center for your organization’s Google Cloud environment. Your security team needs to receive alerts of potential crypto mining in the organization’s compute environment and alerts for common Google Cloud misconfigurations that impact security. Which Security Command Center features should you use to configure these alerts? (Choose two.)

Options:

A.

Event Threat Detection

B.

Container Threat Detection

C.

Security Health Analytics

D.

Cloud Data Loss Prevention

E.

Google Cloud Armor

Question 85

Your organization has established a highly sensitive project within a VPC Service Controls perimeter. You need to ensure that only users meeting specific contextual requirements such as having a company-managed device, a specific location, and a valid user identity can access resources within this perimeter. You want to evaluate the impact of this change without blocking legitimate access. What should you do?

Options:

A.

Configure a VPC Service Controls perimeter in dry run mode, and enforce strict network segmentation using firewall rules. Use multi-factor authentication (MFA) for user verification.

B.

Use Cloud Audit Logs to monitor user access to the project resources. Use post-incident analysis to identify unauthorized access attempts.

C.

Establish a Context-Aware Access policy that specifies the required contextual attributes, and associate the policy with the VPC Service Controls perimeter in dry run mode.

D.

Use the VPC Service Controls violation dashboard to identify details about access denials by service perimeters and assess their impact.

Question 86

Your company uses Google Cloud and has publicly exposed network assets. You want to discover the assets and perform a security audit on these assets by using a software tool in the least amount of time.

What should you do?

Options:

A.

Run a platform security scanner on all instances in the organization.

B.

Notify Google about the pending audit and wait for confirmation before performing the scan.

C.

Contact a Google approved security vendor to perform the audit.

D.

Identify all external assets by using Cloud Asset Inventory and then run a network security scanner against them.
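
As background for option D, Cloud Asset Inventory can enumerate resource types that are typically externally reachable (for example, external addresses and forwarding rules) before a scanner is run against them. The sketch below uses the google-cloud-asset client; the organization ID and asset types are placeholder examples, not an exhaustive list of public-facing types.

```python
# Sketch: search Cloud Asset Inventory for externally reachable resources.
# Requires: pip install google-cloud-asset
from google.cloud import asset_v1

client = asset_v1.AssetServiceClient()

results = client.search_all_resources(
    request={
        "scope": "organizations/123456789012",  # hypothetical org ID
        "asset_types": [
            "compute.googleapis.com/Address",
            "compute.googleapis.com/ForwardingRule",
        ],
    }
)
for resource in results:
    print(resource.asset_type, resource.name)
```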

Question 87

Your team wants to centrally manage GCP IAM permissions from their on-premises Active Directory Service. Your team wants to manage permissions by AD group membership.

What should your team do to meet these requirements?

Options:

A.

Set up Cloud Directory Sync to sync groups, and set IAM permissions on the groups.

B.

Set up SAML 2.0 Single Sign-On (SSO), and assign IAM permissions to the groups.

C.

Use the Cloud Identity and Access Management API to create groups and IAM permissions from Active Directory.

D.

Use the Admin SDK to create groups and assign IAM permissions from Active Directory.

Question 88

A customer terminates an engineer and needs to make sure the engineer's Google account is automatically deprovisioned.

What should the customer do?

Options:

A.

Use the Cloud SDK with their directory service to remove their IAM permissions in Cloud Identity.

B.

Use the Cloud SDK with their directory service to provision and deprovision users from Cloud Identity.

C.

Configure Cloud Directory Sync with their directory service to provision and deprovision users from Cloud Identity.

D.

Configure Cloud Directory Sync with their directory service to remove their IAM permissions in Cloud Identity.

Question 89

You are deploying a web application hosted on Compute Engine. A business requirement mandates that application logs are preserved for 12 years and data is kept within European boundaries. You want to implement a storage solution that minimizes overhead and is cost-effective. What should you do?

Options:

A.

Create a Cloud Storage bucket to store your logs in the EUROPE-WEST1 region. Modify your application code to ship logs directly to your bucket for increased efficiency.

B.

Configure your Compute Engine instances to use the Google Cloud's operations suite Cloud Logging agent to send application logs to a custom log bucket in the EUROPE-WEST1 region with a custom retention of 12 years.

C.

Use a Pub/Sub topic to forward your application logs to a Cloud Storage bucket in the EUROPE-WEST1 region.

D.

Configure a custom retention policy of 12 years on your Google Cloud's operations suite log bucket in the EUROPE-WEST1 region.
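
As a reference for the log-bucket retention mechanism the options mention, the sketch below creates a regional Cloud Logging bucket with a custom retention of roughly 12 years using the generated logging config client. The project, location, bucket ID, and the exact request shape are assumptions that may need adjusting to the installed library version.

```python
# Sketch: create a regional log bucket with ~12 years of retention.
# Requires: pip install google-cloud-logging
from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client

client = ConfigServiceV2Client()

bucket = client.create_bucket(
    request={
        "parent": "projects/my-project/locations/europe-west1",  # placeholders
        "bucket_id": "app-logs-12y",
        "bucket": {"retention_days": 4380},  # approximately 12 years
    }
)
print(bucket.name, bucket.retention_days)
```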

Exam Detail
Vendor: Google
Certification: Google Cloud Certified
Last Update: Nov 16, 2025
Professional-Cloud-Security-Engineer Question Answers