
Google Cloud Certified - Professional Cloud Security Engineer Questions and Answers

Question 1

While migrating your organization’s infrastructure to GCP, a large number of users will need to access the GCP Console. The Identity Management team already has a well-established way to manage your users and wants to keep using your existing Active Directory or LDAP server along with the existing SSO password.

What should you do?

Options:

A.

Manually synchronize the data in Google domain with your existing Active Directory or LDAP server.

B.

Use Google Cloud Directory Sync to synchronize the data in Google domain with your existing Active Directory or LDAP server.

C.

Users sign in directly to the GCP Console using the credentials from your on-premises Kerberos compliant identity provider.

D.

Users sign in using OpenID (OIDC) compatible IdP, receive an authentication token, then use that token to log in to the GCP Console.

Question 2

An organization’s typical network and security review consists of analyzing application transit routes, request handling, and firewall rules. They want to enable their developer teams to deploy new applications without the overhead of this full review.

How should you advise this organization?

Options:

A.

Use Forseti with Firewall filters to catch any unwanted configurations in production.

B.

Mandate use of infrastructure as code and provide static analysis in the CI/CD pipelines to enforce policies.

C.

Route all VPC traffic through customer-managed routers to detect malicious patterns in production.

D.

All production applications will run on-premises. Allow developers free rein in GCP as their dev and QA platforms.

Question 3

The CISO of your highly regulated organization has mandated that all AI applications running in production must be based on Google first-party models. Your security team has now implemented the Model Garden's organization policy meant to centrally control access and user actions on these approved models at the production folder level. However, it appears that someone has overwritten the policy. This has allowed developers to access third-party models on a particular production project. You need to resolve the issue with a solution that prevents a repeat occurrence. What should you do?

Options:

A.

Withdraw the Organization Policy Administrator role from all non-security team principals at the organization level.

B.

Withdraw the Organization Policy Administrator role from all non-security team principals at the production folder level.

C.

Implement a security posture based on the secure_ai_extended template to notify the security team of any policy changes at the organization level.

D.

Implement a security posture based on the secure_ai_extended template to notify the security team of any policy changes at the production folder level.

Question 4

A business unit at a multinational corporation signs up for GCP and starts moving workloads into GCP. The business unit creates a Cloud Identity domain with an organization resource that has hundreds of projects.

Your team becomes aware of this and wants to take over managing permissions and auditing the domain resources.

Which type of access should your team grant to meet this requirement?

Options:

A.

Organization Administrator

B.

Security Reviewer

C.

Organization Role Administrator

D.

Organization Policy Administrator

Question 5

You are responsible for protecting highly sensitive data in BigQuery. Your operations teams need access to this data, but given privacy regulations, you want to ensure that they cannot read the sensitive fields such as email addresses and first names. These specific sensitive fields should only be available on a need-to-know basis to the HR team. What should you do?

Options:

A.

Perform data masking with the DLP API and store that data in BigQuery for later use.

B.

Perform data redaction with the DLP API and store that data in BigQuery for later use.

C.

Perform data inspection with the DLP API and store that data in BigQuery for later use.

D.

Perform tokenization for pseudonymization with the DLP API and store that data in BigQuery for later use.

Question 6

Your organization develops software involved in many open source projects and is concerned about software supply chain threats. You need to deliver provenance for the build to demonstrate that the software has not been tampered with.

What should you do?

Options:

A.

1. Generate Supply Chain Levels for Software Artifacts (SLSA) level 3 assurance by using Cloud Build.
2. View the build provenance in the Security insights side panel within the Google Cloud console.

B.

1. Review the software process.
2. Generate private and public key pairs and use Pretty Good Privacy (PGP) protocols to sign the output software artifacts together with a file containing the address of your enterprise and point of contact.
3. Publish the PGP-signed attestation to your public web page.

C.

1. Publish the software code on GitHub as open source.
2. Establish a bug bounty program, and encourage the open source community to review, report, and fix the vulnerabilities.

D.

1. Hire an external auditor to review and provide provenance.
2. Define the scope and conditions.
3. Get support from the Security department or representative.
4. Publish the attestation to your public web page.

Question 7

Your Google Cloud environment has one organization node, one folder named "Apps," and several projects within that folder. The organization node enforces the constraints/iam.allowedPolicyMemberDomains organization policy, which allows members from the terramearth.com organization. The "Apps" folder enforces the constraints/iam.allowedPolicyMemberDomains organization policy, which allows members from the flowlogistic.com organization. It also has the inheritFromParent: false property.

You attempt to grant access to a project in the Apps folder to the user testuser@terramearth.com.

What is the result of your action and why?

Options:

A.

The action fails because a constraints/iam.allowedPolicyMemberDomains organization policy must be defined on the current project to deactivate the constraint temporarily.

B.

The action fails because a constraints/iam.allowedPolicyMemberDomains organization policy is in place and only members from the flowlogistic.com organization are allowed.

C.

The action succeeds because members from both organizations, terramearth.com and flowlogistic.com, are allowed on projects in the "Apps" folder.

D.

The action succeeds and the new member is successfully added to the project's Identity and Access Management (IAM) policy because all policies are inherited by underlying folders and projects.
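For reference, a folder-level policy like the one in this scenario can be written in the v2 organization-policy YAML format and applied with gcloud. This is a minimal sketch; the folder ID and the allowed value are placeholders, and note that this constraint takes Cloud Identity customer IDs (C0xxxxxxx) rather than raw domain names:

# Hypothetical IDs throughout; C0flowlog1 stands in for flowlogistic.com's customer ID.
cat > folder-policy.yaml <<'EOF'
name: folders/FOLDER_ID/policies/iam.allowedPolicyMemberDomains
spec:
  inheritFromParent: false
  rules:
  - values:
      allowedValues:
      - C0flowlog1
EOF

# Apply the policy to the "Apps" folder.
gcloud org-policies set-policy folder-policy.yaml

Because inheritFromParent is false, the parent organization's allowed domains are not merged into the folder's policy.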

Question 8

You plan to use a Google Cloud Armor policy to prevent common attacks such as cross-site scripting (XSS) and SQL injection (SQLi) from reaching your web application's backend. What are two requirements for using Google Cloud Armor security policies? (Choose two.)

Options:

A.

The load balancer must be an external SSL proxy load balancer.

B.

Google Cloud Armor Policy rules can only match on Layer 7 (L7) attributes.

C.

The load balancer must use the Premium Network Service Tier.

D.

The backend service's load balancing scheme must be EXTERNAL.

E.

The load balancer must be an external HTTP(S) load balancer.

Question 9

A customer implements Cloud Identity-Aware Proxy for their ERP system hosted on Compute Engine. Their security team wants to add a security layer so that the ERP systems only accept traffic from Cloud Identity-Aware Proxy.

What should the customer do to meet these requirements?

Options:

A.

Make sure that the ERP system can validate the JWT assertion in the HTTP requests.

B.

Make sure that the ERP system can validate the identity headers in the HTTP requests.

C.

Make sure that the ERP system can validate the x-forwarded-for headers in the HTTP requests.

D.

Make sure that the ERP system can validate the user’s unique identifier headers in the HTTP requests.

Question 10

Your company must follow industry-specific regulations. Therefore, you need to enforce customer-managed encryption keys (CMEK) for all new Cloud Storage resources in the organization called org1.

What command should you execute?

Options:

A.

• organization policy: constraints/gcp.restrictStorageNonCmekServices
• binding at: org1
• policy type: deny
• policy value: storage.googleapis.com

B.

• organization policy: constraints/gcp.restrictNonCmekServices
• binding at: org1
• policy type: deny
• policy value: storage.googleapis.com

C.

• organization policy: constraints/gcp.restrictStorageNonCmekServices
• binding at: org1
• policy type: allow
• policy value: all supported services

D.

• organization policy: constraints/gcp.restrictNonCmekServices
• binding at: org1
• policy type: allow
• policy value: storage.googleapis.com
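For orientation, the deny-style binding these options describe maps onto the v2 organization-policy format roughly as follows; the organization ID is a placeholder:

cat > cmek-policy.yaml <<'EOF'
name: organizations/ORG_ID/policies/gcp.restrictNonCmekServices
spec:
  rules:
  - values:
      deniedValues:
      - storage.googleapis.com
EOF

# Denying storage.googleapis.com under this constraint forces new Cloud Storage
# resources to use CMEK.
gcloud org-policies set-policy cmek-policy.yaml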

Question 11

Your company’s detection and response team requires break-glass access to the Google Cloud organization in the event of a security investigation. At the end of each day, all security group membership is removed. You need to automate user provisioning to a Cloud Identity security group. You have created a service account to provision group memberships. Your solution must follow Google-recommended practices and comply with the principle of least privilege. What should you do?

Options:

A.

In Google Workspace, grant the service account client ID access to the scope, https://www.googleapis.com/auth/admin.directory.group, by using domain-wide delegation, and use a service account key.

B.

In Google Workspace, grant the service account client ID access to the scope, https://www.googleapis.com/auth/admin.directory.group, by using domain-wide delegation. Use Application Default Credentials with the resource-attached service account.

C.

In Google Workspace, grant the Groups Editor role to the service account. Enable the Cloud Identity API. Use a service account key.

D.

In Google Workspace, grant the Groups Editor role to the service account, enable the Cloud Identity API, and use Application Default Credentials with the resource-attached service account.

Question 12

You are a security engineer at a finance company. Your organization plans to store data on Google Cloud, but your leadership team is worried about the security of their highly sensitive data. Specifically, your company is concerned about internal Google employees' ability to access your company's data on Google Cloud. What solution should you propose?

Options:

A.

Use customer-managed encryption keys.

B.

Use Google's Identity and Access Management (IAM) service to manage access controls on Google Cloud.

C.

Enable Admin activity logs to monitor access to resources.

D.

Enable Access Transparency logs with Access Approval requests for Google employees.

Question 13

A patch for a vulnerability has been released, and a DevOps team needs to update their running containers in Google Kubernetes Engine (GKE).

How should the DevOps team accomplish this?

Options:

A.

Use Puppet or Chef to push out the patch to the running container.

B.

Verify that auto upgrade is enabled; if so, Google will upgrade the nodes in a GKE cluster.

C.

Update the application code or apply a patch, build a new image, and redeploy it.

D.

Configure containers to automatically upgrade when the base image is available in Container Registry.

Question 14

You are a member of the security team at an organization. Your team has a single GCP project with credit card payment processing systems alongside web applications and data processing systems. You want to reduce the scope of systems subject to PCI audit standards.

What should you do?

Options:

A.

Use multi-factor authentication for admin access to the web application.

B.

Use only applications certified compliant with PA-DSS.

C.

Move the cardholder data environment into a separate GCP project.

D.

Use VPN for all connections between your office and cloud environments.

Question 15

You need to set up two network segments: one with an untrusted subnet and the other with a trusted subnet. You want to configure a virtual appliance such as a next-generation firewall (NGFW) to inspect all traffic between the two network segments. How should you design the network to inspect the traffic?

Options:

A.

1. Set up one VPC with two subnets: one trusted and the other untrusted.
2. Configure a custom route for all traffic (0.0.0.0/0) pointed to the virtual appliance.

B.

1. Set up one VPC with two subnets: one trusted and the other untrusted.
2. Configure a custom route for all RFC1918 subnets pointed to the virtual appliance.

C.

1. Set up two VPC networks: one trusted and the other untrusted, and peer them together.
2. Configure a custom route on each network pointed to the virtual appliance.

D.

1. Set up two VPC networks: one trusted and the other untrusted.
2. Configure a virtual appliance using multiple network interfaces, with each interface connected to one of the VPC networks.

Question 16

You are deploying regulated workloads on Google Cloud. The regulation has data residency and data access requirements. It also requires that support is provided from the same geographical location as where the data resides.

What should you do?

Options:

A.

Enable Access Transparency Logging.

B.

Deploy resources only to regions permitted by data residency requirements.

C.

Use Data Access logging and Access Transparency logging to confirm that no users are accessing data from another region.

D.

Deploy Assured Workloads.

Question 17

Your organization operates in a highly regulated environment and has a stringent set of compliance requirements for protecting customer data. You must encrypt data while in use to meet regulations. What should you do?

Options:

A.

Use customer-managed encryption keys (CMEK) and Cloud KMS to enable your organization to control their keys for data encryption in Cloud SQL.

B.

Enable the use of customer-supplied encryption keys (CSEK) in the Google Compute Engine VMs to give your organization maximum control over their VM disk encryption.

C.

Establish a trusted execution environment with a Confidential VM.

D.

Use a Shielded VM to ensure a secure boot with integrity monitoring for the application environment.

Question 18

How should a customer reliably deliver Stackdriver logs from GCP to their on-premises SIEM system?

Options:

A.

Send all logs to the SIEM system via an existing protocol such as syslog.

B.

Configure every project to export all their logs to a common BigQuery DataSet, which will be queried by the SIEM system.

C.

Configure Organizational Log Sinks to export logs to a Cloud Pub/Sub Topic, which will be sent to the SIEM via Dataflow.

D.

Build a connector for the SIEM to query for all logs in real time from the GCP RESTful JSON APIs.
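For reference, an organization-level aggregated sink to Pub/Sub of the kind described in option C can be sketched with gcloud as follows; the project, topic, and organization ID are hypothetical:

# Topic that a Dataflow pipeline reads from before forwarding to the SIEM.
gcloud pubsub topics create siem-export --project=central-logging

# Aggregated sink: --include-children exports logs from every project in the org.
gcloud logging sinks create siem-sink \
  pubsub.googleapis.com/projects/central-logging/topics/siem-export \
  --organization=123456789012 --include-children \
  --log-filter='logName:"cloudaudit.googleapis.com"'

# Then grant the sink's writer identity (shown by `gcloud logging sinks describe`)
# roles/pubsub.publisher on the topic.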

Question 19

Your organization relies heavily on virtual machines (VMs) in Compute Engine. Due to team growth and resource demands, VM sprawl is becoming problematic. Maintaining consistent security hardening and timely package updates poses an increasing challenge. You need to centralize VM image management and automate the enforcement of security baselines throughout the virtual machine lifecycle. What should you do?

Options:

A.

Activate Security Command Center Enterprise. Use VM discovery and posture management features to monitor hardening state and trigger automatic responses upon detection of issues.

B.

Create a Cloud Build trigger to build a pipeline that generates hardened VM images. Run vulnerability scans in the pipeline, and store images with passing scans in a registry. Use instance templates pointing to this registry.

C.

Configure the sole-tenancy feature in Compute Engine for all projects. Set up custom organization policies in Policy Controller to restrict the operating systems and image sources that teams are allowed to use.

D.

Use VM Manager to automatically distribute and apply patches to VMs across your projects. Integrate VM Manager with hardened, organization-standard VM images stored in a central repository.

Question 20

Your organization is developing a sophisticated machine learning (ML) model to predict customer behavior for targeted marketing campaigns. The BigQuery dataset used for training includes sensitive personal information. You must design the security controls around the AI/ML pipeline. Data privacy must be maintained throughout the model's lifecycle, and you must ensure that personal data is not used in the training process. Additionally, you must restrict access to the dataset to an authorized subset of people only. What should you do?

Options:

A.

Implement at-rest encryption by using customer-managed encryption keys (CMEK) for the pipeline. Implement strict Identity and Access Management (IAM) policies to control access to BigQuery.

B.

De-identify sensitive data before model training by using Cloud Data Loss Prevention (DLP) APIs, and implement strict Identity and Access Management (IAM) policies to control access to BigQuery.

C.

Implement Identity-Aware Proxy to enforce context-aware access to BigQuery and models based on user identity and device.

D.

Deploy the model on Confidential VMs for enhanced protection of data and code while in use. Implement strict Identity and Access Management (IAM) policies to control access to BigQuery.

Question 21

You need to enforce a security policy in your Google Cloud organization that prevents users from exposing objects in their buckets externally. There are currently no buckets in your organization. Which solution should you implement proactively to achieve this goal with the least operational overhead?

Options:

A.

Create an hourly cron job to run a Cloud Function that finds public buckets and makes them private.

B.

Enable the constraints/storage.publicAccessPrevention constraint at the organization level.

C.

Enable the constraints/storage.uniformBucketLevelAccess constraint at the organization level.

D.

Create a VPC Service Controls perimeter that protects the storage.googleapis.com service in your projects that contain buckets. Add any new project that contains a bucket to the perimeter.
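For context, the constraint named in option B can be enforced organization-wide with a single command; the organization ID is a placeholder:

gcloud resource-manager org-policies enable-enforce \
  constraints/storage.publicAccessPrevention --organization=123456789012

Because the constraint is inherited by every current and future project, no per-bucket remediation is needed.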

Question 22

In a shared security responsibility model for IaaS, which two layers of the stack does the customer share responsibility for? (Choose two.)

Options:

A.

Hardware

B.

Network Security

C.

Storage Encryption

D.

Access Policies

E.

Boot

Question 23

Your organization must follow the Payment Card Industry Data Security Standard (PCI DSS). To prepare for an audit, you must detect deviations at an infrastructure-as-a-service level in your Google Cloud landing zone. What should you do?

Options:

A.

Create a data profile covering all payment-relevant data types. Configure Data Discovery and a risk analysis job in Google Cloud Sensitive Data Protection to analyze findings.

B.

Use the Google Cloud Compliance Reports Manager to download the latest version of the PCI DSS report. Analyze the report to detect deviations.

C.

Create an Assured Workloads folder in your Google Cloud organization. Migrate existing projects into the folder and monitor for deviations in the PCI DSS.

D.

Activate Security Command Center Premium. Use the Compliance Monitoring product to filter findings that may not be PCI DSS compliant.

Question 24

An organization receives an increasing number of phishing emails.

Which method should be used to protect employee credentials in this situation?

Options:

A.

Multifactor Authentication

B.

A strict password policy

C.

Captcha on login pages

D.

Encrypted emails

Question 25

You have the following resource hierarchy. There is an organization policy at each node in the hierarchy as shown. Which load balancer types are denied in VPC A?

Options:

A.

All load balancer types are denied in accordance with the global node’s policy.

B.

INTERNAL_TCP_UDP, INTERNAL_HTTP_HTTPS is denied in accordance with the folder’s policy.

C.

EXTERNAL_TCP_PROXY, EXTERNAL_SSL_PROXY are denied in accordance with the project’s policy.

D.

EXTERNAL_TCP_PROXY, EXTERNAL_SSL_PROXY, INTERNAL_TCP_UDP, and INTERNAL_HTTP_HTTPS are denied in accordance with the folder and project’s policies.

Question 26

Your organization uses BigQuery to process highly sensitive, structured datasets. Following the "need to know" principle, you need to create the Identity and Access Management (IAM) design to meet the needs of these users:

• Business user: must access curated reports.

• Data engineer: must administer the data lifecycle in the platform.

• Security operator: must review user activity on the data platform.

What should you do?

Options:

A.

Configure data access logs for BigQuery services, and grant the Project Viewer role to security operators.

B.

Generate a CSV data file based on the business user's needs, and send the data to their email addresses.

C.

Create curated tables in a separate dataset and assign the role roles/bigquery.dataViewer.

D.

Set row-based access control based on the "region" column, and filter the records from the United States for data engineers.

Question 27

You have been tasked with implementing external web application protection against common web application attacks for a public application on Google Cloud. You want to validate these policy changes before they are enforced. What service should you use?

Options:

A.

Google Cloud Armor's preconfigured rules in preview mode

B.

Prepopulated VPC firewall rules in monitor mode

C.

The inherent protections of Google Front End (GFE)

D.

Cloud Load Balancing firewall rules

E.

VPC Service Controls in dry run mode
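As an illustrative sketch of preview mode, Google Cloud Armor's preconfigured WAF rules can be attached so that matches are logged but not enforced; the policy name and rule priorities are hypothetical:

gcloud compute security-policies create web-app-policy

# --preview logs matching requests without blocking them, so the rules can be
# validated before enforcement.
gcloud compute security-policies rules create 1000 \
  --security-policy=web-app-policy \
  --expression="evaluatePreconfiguredExpr('xss-stable')" \
  --action=deny-403 --preview

gcloud compute security-policies rules create 1001 \
  --security-policy=web-app-policy \
  --expression="evaluatePreconfiguredExpr('sqli-stable')" \
  --action=deny-403 --preview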

Question 28

An organization is evaluating the use of Google Cloud Platform (GCP) for certain IT workloads. A well-established directory service is used to manage user identities and lifecycle management. The organization must continue to use this directory service as the "source of truth" directory for identities.

Which solution meets the organization's requirements?

Options:

A.

Google Cloud Directory Sync (GCDS)

B.

Cloud Identity

C.

Security Assertion Markup Language (SAML)

D.

Pub/Sub

Question 29

You have stored company-approved compute images in a single Google Cloud project that is used as an image repository. This project is protected with VPC Service Controls and exists in the perimeter along with other projects in your organization. This lets other projects deploy images from the image repository project. A team requires deploying a third-party disk image that is stored in an external Google Cloud organization. You need to grant read access to the disk image so that it can be deployed into the perimeter.

What should you do?

Options:

A.

1. Update the perimeter.
2. Configure the egressTo field to set identityType to any_identity.
3. Configure the egressFrom field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.

B.

Allow the external project by using the organization policy constraints/compute.trustedImageProjects.

C.

1. Update the perimeter.
2. Configure the egressTo field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.
3. Configure the egressFrom field to set identityType to any_identity.

D.

1. Update the perimeter.
2. Configure the ingressFrom field to set identityType to any_identity.
3. Configure the ingressTo field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.
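For reference, an egress rule of the shape these options describe might be sketched as follows; the perimeter name, access policy ID, and project number are placeholders:

cat > egress.yaml <<'EOF'
- egressFrom:
    identityType: ANY_IDENTITY
  egressTo:
    resources:
    - projects/EXTERNAL_PROJECT_NUMBER
    operations:
    - serviceName: compute.googleapis.com
      methodSelectors:
      - method: "*"
EOF

gcloud access-context-manager perimeters update my-perimeter \
  --policy=ACCESS_POLICY_ID --set-egress-policies=egress.yaml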

Question 30

Your company has multiple teams needing access to specific datasets across various Google Cloud data services for different projects. You need to ensure that team members can only access the data relevant to their projects and prevent unauthorized access to sensitive information within BigQuery, Cloud Storage, and Cloud SQL. What should you do?

Options:

A.

Grant project-level group permissions by using specific Cloud IAM roles. Use BigQuery authorized views, Cloud Storage uniform bucket-level access, and Cloud SQL database roles.

B.

Configure an access level to control access to the Google Cloud console for users managing these data services. Require multi-factor authentication for all access attempts.

C.

Use VPC Service Controls to create security perimeters around the projects for the BigQuery, Cloud Storage, and Cloud SQL services, restricting access based on the network origin of the requests.

D.

Enable project-level data access logs for BigQuery, Cloud Storage, and Cloud SQL. Configure log sinks to export these logs to Security Command Center to identify unauthorized access attempts.

Question 31

You are designing a new governance model for your organization's secrets that are stored in Secret Manager. Currently, secrets for Production and Non-Production applications are stored and accessed using service accounts. Your proposed solution must:

Provide granular access to secrets

Give you control over the rotation schedules for the encryption keys that wrap your secrets

Maintain environment separation

Provide ease of management

Which approach should you take?

Options:

A.

1. Use separate Google Cloud projects to store Production and Non-Production secrets.
2. Enforce access control to secrets using project-level Identity and Access Management (IAM) bindings.
3. Use customer-managed encryption keys to encrypt secrets.

B.

1. Use a single Google Cloud project to store both Production and Non-Production secrets.
2. Enforce access control to secrets using secret-level Identity and Access Management (IAM) bindings.
3. Use Google-managed encryption keys to encrypt secrets.

C.

1. Use separate Google Cloud projects to store Production and Non-Production secrets.
2. Enforce access control to secrets using secret-level Identity and Access Management (IAM) bindings.
3. Use Google-managed encryption keys to encrypt secrets.

D.

1. Use a single Google Cloud project to store both Production and Non-Production secrets.
2. Enforce access control to secrets using project-level Identity and Access Management (IAM) bindings.
3. Use customer-managed encryption keys to encrypt secrets.
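To ground the mechanisms these options combine, a CMEK-protected secret with a secret-level IAM binding can be sketched as follows; all resource names are hypothetical:

gcloud secrets create payments-api-key \
  --project=prod-secrets \
  --replication-policy=user-managed --locations=us-central1 \
  --kms-key-name=projects/prod-secrets/locations/us-central1/keyRings/secrets-kr/cryptoKeys/secrets-kek

# Secret-level binding: grants access to this one secret only.
gcloud secrets add-iam-policy-binding payments-api-key \
  --project=prod-secrets \
  --member=serviceAccount:payments-app@prod-app.iam.gserviceaccount.com \
  --role=roles/secretmanager.secretAccessor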

Question 32

You are the Security Admin in your company. You want to synchronize all security groups that have an email address from your LDAP directory in Cloud IAM.

What should you do?

Options:

A.

Configure Google Cloud Directory Sync to sync security groups using LDAP search rules that have “user email address” as the attribute to facilitate one-way sync.

B.

Configure Google Cloud Directory Sync to sync security groups using LDAP search rules that have “user email address” as the attribute to facilitate bidirectional sync.

C.

Use a management tool to sync the subset based on the email address attribute. Create a group in the Google domain. A group created in a Google domain will automatically have an explicit Google Cloud Identity and Access Management (IAM) role.

D.

Use a management tool to sync the subset based on group object class attribute. Create a group in the Google domain. A group created in a Google domain will automatically have an explicit Google Cloud Identity and Access Management (IAM) role.

Question 33

You are working with protected health information (PHI) for an electronic health record system. The privacy officer is concerned that sensitive data is stored in the analytics system. You are tasked with anonymizing the sensitive data in a way that is not reversible. Also, the anonymized data should not preserve the character set and length. Which Google Cloud solution should you use?

Options:

A.

Cloud Data Loss Prevention with deterministic encryption using AES-SIV

B.

Cloud Data Loss Prevention with format-preserving encryption

C.

Cloud Data Loss Prevention with cryptographic hashing

D.

Cloud Data Loss Prevention with Cloud Key Management Service wrapped cryptographic keys

Question 34

Your financial services company is migrating its operations to Google Cloud. You are implementing a centralized logging strategy to meet strict regulatory compliance requirements. Your company's Google Cloud organization has a dedicated folder for all production projects. All audit logs, including Data Access logs from all current and future projects within this production folder, must be securely collected and stored in a central BigQuery dataset for long-term retention and analysis. To prevent duplicate log storage and to enforce centralized control, you need to implement a logging solution that intercepts and overrides any project-level log sinks for these audit logs, to ensure that logs are not inadvertently routed elsewhere. What should you do?

Options:

A.

Create an aggregated log sink at the production folder level with a destination of the central BigQuery dataset. Configure an inclusion filter for all audit and Data Access logs. Grant the Logs Bucket Writer role to the sink's service account on the production folder.

B.

Create a log sink in each production project to route audit logs to the central BigQuery dataset. Set the writer_identity field of each sink to a service account with BigQuery Data Editor permissions on the central dataset.

C.

Create an aggregated log sink at the organization level with a destination of the central BigQuery dataset and a filter for all audit logs. Use the --include-children flag and configure a log view for the production folder.

D.

Create an intercepting aggregated log sink at the production folder level with the central BigQuery dataset as the destination. Configure an inclusion filter for the necessary audit logs. Grant the appropriate IAM permissions to the sink's writer_identity on the BigQuery dataset.
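As a sketch of the folder-level intercepting sink described in option D, assuming a gcloud release that supports the --intercept-children flag (the folder ID, project, and dataset are placeholders):

gcloud logging sinks create prod-audit-sink \
  bigquery.googleapis.com/projects/central-logging/datasets/prod_audit_logs \
  --folder=FOLDER_ID --include-children --intercept-children \
  --log-filter='logName:"cloudaudit.googleapis.com"'

# Grant roles/bigquery.dataEditor on the dataset to the sink's writer identity:
gcloud logging sinks describe prod-audit-sink --folder=FOLDER_ID \
  --format='value(writerIdentity)'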

Question 35

A customer terminates an engineer and needs to make sure the engineer's Google account is automatically deprovisioned.

What should the customer do?

Options:

A.

Use the Cloud SDK with their directory service to remove their IAM permissions in Cloud Identity.

B.

Use the Cloud SDK with their directory service to provision and deprovision users from Cloud Identity.

C.

Configure Cloud Directory Sync with their directory service to provision and deprovision users from Cloud Identity.

D.

Configure Cloud Directory Sync with their directory service to remove their IAM permissions in Cloud Identity.

Question 36

You have noticed an increased number of phishing attacks across your enterprise user accounts. You want to implement the Google 2-Step Verification (2SV) option that uses a cryptographic signature to authenticate a user and verify the URL of the login page. Which Google 2SV option should you use?

Options:

A.

Titan Security Keys

B.

Google prompt

C.

Google Authenticator app

D.

Cloud HSM keys

Question 37

You are working with a client that is concerned about control of their encryption keys for sensitive data. The client does not want to store encryption keys at rest in the same cloud service provider (CSP) as the data that the keys are encrypting. Which Google Cloud encryption solutions should you recommend to this client? (Choose two.)

Options:

A.

Customer-supplied encryption keys.

B.

Google default encryption

C.

Secret Manager

D.

Cloud External Key Manager

E.

Customer-managed encryption keys

Question 38

Which Google Cloud service should you use to enforce access control policies for applications and resources?

Options:

A.

Identity-Aware Proxy

B.

Cloud NAT

C.

Google Cloud Armor

D.

Shielded VMs

Question 39

You are using Security Command Center (SCC) to protect your workloads and receive alerts for suspected security breaches at your company. You need to detect cryptocurrency mining software. Which SCC service should you use?

Options:

A.

Web Security Scanner

B.

Container Threat Detection

C.

Rapid Vulnerability Detection

D.

Virtual Machine Threat Detection

Question 40

You need to use Cloud External Key Manager to create an encryption key to encrypt specific BigQuery data at rest in Google Cloud. Which steps should you do first?

Options:

A.

1. Create or use an existing key with a unique uniform resource identifier (URI) in your Google Cloud project.
2. Grant your Google Cloud project access to a supported external key management partner system.

B.

1. Create or use an existing key with a unique uniform resource identifier (URI) in Cloud Key Management Service (Cloud KMS).
2. In Cloud KMS, grant your Google Cloud project access to use the key.

C.

1. Create or use an existing key with a unique uniform resource identifier (URI) in a supported external key management partner system.
2. In the external key management partner system, grant access for this key to use your Google Cloud project.

D.

1. Create an external key with a unique uniform resource identifier (URI) in Cloud Key Management Service (Cloud KMS).
2. In Cloud KMS, grant your Google Cloud project access to use the key.

Question 41

Your Google Cloud organization allows for administrative capabilities to be distributed to each team through provision of a Google Cloud project with the Owner role (roles/owner). The organization contains thousands of Google Cloud projects. Security Command Center Premium has surfaced multiple open_mysql_port findings. You are enforcing the guardrails and need to prevent these types of common misconfigurations.

What should you do?

Options:

A.

Create a firewall rule for each virtual private cloud (VPC) to deny traffic from 0.0.0.0/0 with priority 0.

B.

Create a hierarchical firewall policy configured at the organization to deny all connections from 0.0.0.0/0.

C.

Create a Google Cloud Armor security policy to deny traffic from 0.0.0.0/0.

D.

Create a hierarchical firewall policy configured at the organization to allow connections only from internal IP ranges.
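For reference, a hierarchical firewall policy can be sketched as follows; the policy name and organization ID are placeholders, and the rule shown targets the MySQL port behind the open_mysql_port findings (a broader deny follows the same pattern):

gcloud compute firewall-policies create \
  --short-name=org-guardrails --organization=123456789012

# Deny inbound MySQL from anywhere; project Owners cannot override this.
gcloud compute firewall-policies rules create 1000 \
  --firewall-policy=org-guardrails --organization=123456789012 \
  --direction=INGRESS --action=deny \
  --src-ip-ranges=0.0.0.0/0 --layer4-configs=tcp:3306

# Attach the policy to the organization node.
gcloud compute firewall-policies associations create \
  --firewall-policy=org-guardrails --organization=123456789012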

Question 42

Your company is migrating a customer database that contains personally identifiable information (PII) to Google Cloud. To prevent accidental exposure, this data must be protected at rest. You need to ensure that all PII is automatically discovered and redacted, or pseudonymized, before any type of analysis. What should you do?

Options:

A.

Implement Cloud Armor to protect the database from external threats and configure firewall rules to restrict network access to only authorized internal IP addresses.

B.

Configure Sensitive Data Protection to scan the database for PII using both predefined and custom infoTypes and to mask sensitive data.

C.

Use Cloud KMS to encrypt the database at rest with a customer-managed encryption key (CMEK). Implement VPC Service Controls.

D.

Create Cloud Storage buckets with object versioning enabled, and use IAM policies to restrict access to the data. Use the Data Loss Prevention API (DLP API) on the buckets to scan for sensitive data and generate detection alerts.
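To make the masking approach concrete, a minimal Sensitive Data Protection content:deidentify call that masks email addresses looks roughly like this; the project ID and sample text are placeholders:

curl -s -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  "https://dlp.googleapis.com/v2/projects/my-project/content:deidentify" \
  -d '{
    "inspectConfig": {"infoTypes": [{"name": "EMAIL_ADDRESS"}]},
    "deidentifyConfig": {
      "infoTypeTransformations": {
        "transformations": [{
          "primitiveTransformation": {
            "characterMaskConfig": {"maskingCharacter": "#"}
          }
        }]
      }
    },
    "item": {"value": "Contact jane.doe@example.com for details"}
  }'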

Question 43

Your organization is worried about recent news headlines regarding application vulnerabilities in production applications that have led to security breaches. You want to automatically scan your deployment pipeline for vulnerabilities and ensure only scanned and verified containers can run in the environment. What should you do?

Options:

A.

Enable Binary Authorization and create attestations of scans.

B.

Use gcloud artifacts docker images describe LOCATION-docker.pkg.dev/PROJECT_ID/REPOSITORY/IMAGE_ID@sha256:HASH --show-package-vulnerability in your CI/CD pipeline, and trigger a pipeline failure for critical vulnerabilities.

C.

Use Kubernetes role-based access control (RBAC) as the source of truth for cluster access by granting "container.clusters.get" to limited users. Restrict deployment access by allowing these users to generate a kubeconfig file containing the access configuration for the GKE cluster.

D.

Enforce the use of Cloud Code for development so users receive real-time security feedback on vulnerable libraries and dependencies before they check in their code.

Question 44

Your company's storage team manages all product images within a specific Google Cloud project. To maintain control, you must isolate access to Cloud Storage for this project, allowing the storage team to manage restrictions at the project level. They must be restricted to using corporate computers. What should you do?

Options:

A.

Employ organization-level firewall rules to block all traffic to Cloud Storage. Create exceptions for specific service accounts used by the storage team within their project.

B.

Implement VPC Service Controls by establishing an organization-wide service perimeter with all projects. Configure ingress and egress rules to restrict access to Cloud Storage based on IP address ranges.

C.

Use Context-Aware Access. Create an access level that defines the required context. Apply it as an organization policy specifically at the project level, restricting access to Cloud Storage based on that context.

D.

Use Identity and Access Management (IAM) roles at the project level within the storage team's project. Grant the storage team granular permissions on the project's Cloud Storage resources.

Question 45

You are part of a security team that wants to ensure that a Cloud Storage bucket in Project A can only be readable from Project B. You also want to ensure that data in the Cloud Storage bucket cannot be accessed from or copied to Cloud Storage buckets outside the network, even if the user has the correct credentials.

What should you do?

Options:

A.

Enable VPC Service Controls, create a perimeter with Project A and B, and include Cloud Storage service.

B.

Enable Domain Restricted Sharing Organization Policy and Bucket Policy Only on the Cloud Storage bucket.

C.

Enable Private Access in Project A and B networks with strict firewall rules to allow communication between the networks.

D.

Enable VPC Peering between Project A and B networks with strict firewall rules to allow communication between the networks.

Question 46

Your organization needs to restrict the types of Google Cloud services that can be deployed within specific folders to enforce compliance requirements. You must apply these restrictions only to the designated folders, without affecting other parts of the resource hierarchy. You want to use the most efficient and simple method. What should you do?

Options:

A.

Implement IAM conditions on service account creation within each folder.

B.

Create a global organization policy at the organization level with the Restrict Resource Service Usage constraint, and apply exceptions for other folders.

C.

Create an organization policy at the folder level using the Restrict Resource Service Usage constraint, and define the allowed services per folder.

D.

Configure VPC Service Controls perimeters around each folder, and define the allowed services within the perimeter.
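For reference, a folder-scoped Restrict Resource Service Usage policy can be sketched in the v2 YAML format; the folder ID and the allowed services are placeholders:

cat > folder-services.yaml <<'EOF'
name: folders/FOLDER_ID/policies/gcp.restrictServiceUsage
spec:
  rules:
  - values:
      allowedValues:
      - bigquery.googleapis.com
      - storage.googleapis.com
EOF

# Applies only to resources under this folder; the rest of the hierarchy is untouched.
gcloud org-policies set-policy folder-services.yaml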

Question 47

You are asked to recommend a solution to store and retrieve sensitive configuration data from an application that runs on Compute Engine. Which option should you recommend?

Options:

A.

Cloud Key Management Service

B.

Compute Engine guest attributes

C.

Compute Engine custom metadata

D.

Secret Manager

Question 48

Your organization wants full control of the keys used to encrypt data at rest in their Google Cloud environments. Keys must be generated and stored outside of Google and must integrate with many Google services, including BigQuery.

What should you do?

Options:

A.

Create a Cloud Key Management Service (KMS) key with imported key material. Wrap the key for protection during import. Import the key generated on a trusted system into Cloud KMS.

B.

Create a KMS key that is stored on a Google-managed FIPS 140-2 Level 3 Hardware Security Module (HSM). Manage the Identity and Access Management (IAM) permissions settings, and set up the key rotation period.

C.

Use Cloud External Key Management (EKM) that integrates with an external Hardware Security Module (HSM) system from supported vendors.

D.

Use customer-supplied encryption keys (CSEK) with keys generated on trusted external systems. Provide the raw CSEK as part of the API call.

Question 49

You must ensure that the keys used for at-rest encryption of your data are compliant with your organization's security controls. One security control mandates that keys get rotated every 90 days. You must implement an effective detection strategy to validate if keys are rotated as required. What should you do?

Options:

A.

Analyze the crypto key versions of the keys by using data from Cloud Asset Inventory. If an active key is older than 90 days, send an alert message through your incident notification channel.

B.

Identify keys that have not been rotated by using Security Health Analytics. If a key is not rotated after 90 days, a finding in Security Command Center is raised.

C.

Assess the keys in the Cloud Key Management Service by implementing code in Cloud Run. If a key is not rotated after 90 days, raise a finding in Security Command Center.

D.

Define a metric that checks for timely key updates by using Cloud Logging. If a key is not rotated after 90 days, send an alert message through your incident notification channel.
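As a starting point for such a detection, per-key rotation settings and the age of the primary version can be pulled with gcloud, and Cloud Asset Inventory can enumerate keys organization-wide; the key ring, location, and organization ID are placeholders:

gcloud kms keys list --keyring=my-keyring --location=us-central1 \
  --format="table(name, rotationPeriod, nextRotationTime, primary.createTime)"

# Organization-wide inventory of KMS keys via Cloud Asset Inventory.
gcloud asset search-all-resources \
  --scope=organizations/123456789012 \
  --asset-types=cloudkms.googleapis.com/CryptoKey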

Question 50

You are consulting with a client that requires end-to-end encryption of application data (including data in transit, data in use, and data at rest) within Google Cloud. Which options should you utilize to accomplish this? (Choose two.)

Options:

A.

External Key Manager

B.

Customer-supplied encryption keys

C.

Hardware Security Module

D.

Confidential Computing and Istio

E.

Client-side encryption

Question 51

You work for a large organization where each business unit has thousands of users. You need to delegate management of access control permissions to each business unit. You have the following requirements:

Each business unit manages access controls for their own projects.

Each business unit manages access control permissions at scale.

Business units cannot access other business units' projects.

Users lose their access if they move to a different business unit or leave the company.

Users and access control permissions are managed by the on-premises directory service.

What should you do? (Choose two.)

Options:

A.

Use VPC Service Controls to create perimeters around each business unit's project.

B.

Organize projects in folders, and assign permissions to Google groups at the folder level.

C.

Group business units based on Organization Units (OUs) and manage permissions based on OUs.

D.

Create a project naming convention, and use Google's IAM Conditions to manage access based on the prefix of project names.

E.

Use Google Cloud Directory Sync to synchronize users and group memberships in Cloud Identity.

Question 52

You need to connect your organization's on-premises network with an existing Google Cloud environment that includes one Shared VPC with two subnets named Production and Non-Production. You are required to:

Use a private transport link.

Configure access to Google Cloud APIs through private API endpoints originating from on-premises environments.

Ensure that Google Cloud APIs are only consumed via VPC Service Controls.

What should you do?

Options:

A.

1. Set up a Cloud VPN link between the on-premises environment and Google Cloud.
2. Configure private access using the restricted.googleapis.com domains in on-premises DNS configurations.

B.

1. Set up a Partner Interconnect link between the on-premises environment and Google Cloud.
2. Configure private access using the private.googleapis.com domains in on-premises DNS configurations.

C.

1. Set up a Direct Peering link between the on-premises environment and Google Cloud.
2. Configure private access for both VPC subnets.

D.

1. Set up a Dedicated Interconnect link between the on-premises environment and Google Cloud.
2. Configure private access using the restricted.googleapis.com domains in on-premises DNS configurations.

Question 53

You control network traffic for a folder in your Google Cloud environment. Your folder includes multiple projects and Virtual Private Cloud (VPC) networks. You want to enforce on the folder level that egress connections are limited only to IP range 10.58.5.0/24 and only from the VPC network "dev-vpc." You want to minimize implementation and maintenance effort.

What should you do?

Options:

A.

1. Attach external IP addresses to the VMs in scope.
2. Configure a VPC firewall rule in "dev-vpc" that allows egress connectivity to IP range 10.58.5.0/24 for all source addresses in this network.

B.

1. Attach external IP addresses to the VMs in scope.
2. Define and apply a hierarchical firewall policy on folder level to deny all egress connections and to allow egress to IP range 10.58.5.0/24 from network "dev-vpc."

C.

1. Leave the network configuration of the VMs in scope unchanged.
2. Create a new project including a new VPC network "new-vpc."
3. Deploy a network appliance in "new-vpc" to filter access requests and only allow egress connections from "dev-vpc" to 10.58.5.0/24.

D.

1. Leave the network configuration of the VMs in scope unchanged.
2. Enable Cloud NAT for "dev-vpc" and restrict the target range in Cloud NAT to 10.58.5.0/24.

Question 54

Your organization operates a hybrid cloud environment and has recently deployed a private Artifact Registry repository in Google Cloud. On-premises developers cannot resolve the Artifact Registry hostname and therefore cannot push or pull artifacts. You've verified the following:

Connectivity to Google Cloud is established by Cloud VPN or Cloud Interconnect.

No custom DNS configurations exist on-premises.

There is no route to the internet from the on-premises network.

You need to identify the cause and enable the developers to push and pull artifacts. What is likely causing the issue and what should you do to fix the issue?

Options:

A.

Artifact Registry requires external HTTP/HTTPS access. Create a new firewall rule allowing ingress traffic on ports 80 and 443 from the developer's IP ranges.

B.

Private Google Access is not enabled for the subnet hosting the Artifact Registry. Enable Private Google Access for the appropriate subnet.

C.

On-premises DNS servers lack the necessary records to resolve private Google API domains. Create DNS records for restricted.googleapis.com or private.googleapis.com pointing to Google's published IP ranges.

D.

Developers must be granted the artifactregistry.writer IAM role. Grant the relevant developer group this role.
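For comparison, the Cloud DNS equivalent of the records described in option C is sketched below; the zone name and network are placeholders, on-premises servers need the same answers (typically via DNS forwarding), and 199.36.153.4/30 is the published restricted.googleapis.com range:

gcloud dns managed-zones create googleapis \
  --description="Route Google APIs via the restricted VIP" \
  --dns-name=googleapis.com. --visibility=private --networks=shared-vpc

gcloud dns record-sets create restricted.googleapis.com. \
  --zone=googleapis --type=A --ttl=300 \
  --rrdatas=199.36.153.4,199.36.153.5,199.36.153.6,199.36.153.7

# Send every Google API hostname to the restricted VIP.
gcloud dns record-sets create "*.googleapis.com." \
  --zone=googleapis --type=CNAME --ttl=300 \
  --rrdatas=restricted.googleapis.com.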

Question 55

What are the steps to encrypt data using envelope encryption?

Options:

A.

Generate a data encryption key (DEK) locally. Use a key encryption key (KEK) to wrap the DEK. Encrypt data with the KEK. Store the encrypted data and the wrapped KEK.

B.

Generate a key encryption key (KEK) locally. Use the KEK to generate a data encryption key (DEK). Encrypt data with the DEK. Store the encrypted data and the wrapped DEK.

C.

Generate a data encryption key (DEK) locally. Encrypt data with the DEK. Use a key encryption key (KEK) to wrap the DEK. Store the encrypted data and the wrapped DEK.

D.

Generate a key encryption key (KEK) locally. Generate a data encryption key (DEK) locally. Encrypt data with the KEK. Store the encrypted data and the wrapped DEK.
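The envelope-encryption sequence can be traced end to end with a short sketch; the key ring and key names are placeholders:

# 1. Generate a 256-bit data encryption key (DEK) locally.
openssl rand -out dek.bin 32

# 2. Encrypt the data with the DEK.
openssl enc -aes-256-cbc -pbkdf2 -in data.txt -out data.enc -pass file:dek.bin

# 3. Wrap the DEK with a key encryption key (KEK) held in Cloud KMS.
gcloud kms encrypt --location=us-central1 --keyring=my-kr --key=my-kek \
  --plaintext-file=dek.bin --ciphertext-file=dek.wrapped

# 4. Store the encrypted data and the wrapped DEK; discard the plaintext DEK.
shred -u dek.bin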

Question 56

A customer needs to prevent attackers from hijacking their domain/IP and redirecting users to a malicious site through a man-in-the-middle attack.

Which solution should this customer use?

Options:

A.

VPC Flow Logs

B.

Cloud Armor

C.

DNS Security Extensions

D.

Cloud Identity-Aware Proxy

Question 57

Your company has deployed an artificial intelligence model in a central project. This model has a lot of sensitive intellectual property and must be kept strictly isolated from the internet. You must expose the model endpoint only to a defined list of projects in your organization. What should you do?

Options:

A.

Within the model project, create an external Application Load Balancer that points to the model endpoint. Create a Cloud Armor policy to restrict IP addresses to Google Cloud.

B.

Within the model project, create an internal Application Load Balancer that points to the model endpoint. Expose this load balancer with Private Service Connect to a configured list of projects.

C.

Activate Private Google Access in both the model project and in each project that needs to connect to the model. Create a firewall policy to allow connectivity to Private Google Access addresses.

D.

Create a central project to host Shared VPC networks that are provided to all other projects. Centrally administer all firewall rules in this project to grant access to the model.

Question 58

Your organization wants to be continuously evaluated against the CIS Google Cloud Computing Foundations Benchmark v1.3.0 (CIS Google Cloud Foundation 1.3). Some of the controls are irrelevant to your organization and must be disregarded in evaluation. You need to create an automated system or process to ensure that only the relevant controls are evaluated.

What should you do?

Options:

A.

Mark all security findings that are irrelevant with a tag and a value that indicates a security exception. Select all marked findings and mute them on the console every time they appear. Activate Security Command Center (SCC) Premium.

B.

Activate Security Command Center (SCC) Premium. Create a rule to mute the security findings in SCC so they are not evaluated.

C.

Download all findings from Security Command Center (SCC) to a CSV file. Mark the findings that are part of CIS Google Cloud Foundation 1.3 in the file. Ignore the entries that are irrelevant and out of scope for the company.

D.

Ask an external audit company to provide independent reports including needed CIS benchmarks. In the scope of the audit, clarify that some of the controls are not needed and must be disregarded.

Question 59

You are working with a client who plans to migrate their data to Google Cloud. You are responsible for recommending an encryption service to manage their encrypted keys. You have the following requirements:

The master key must be rotated at least once every 45 days.

The solution that stores the master key must be FIPS 140-2 Level 3 validated.

The master key must be stored in multiple regions within the US for redundancy.

Which solution meets these requirements?

Options:

A.

Customer-managed encryption keys with Cloud Key Management Service

B.

Customer-managed encryption keys with Cloud HSM

C.

Customer-supplied encryption keys

D.

Google-managed encryption keys

Question 60

Your organization has on-premises hosts that need to access Google Cloud APIs. You must enforce private connectivity between these hosts, minimize costs, and optimize for operational efficiency.

What should you do?

Options:

A.

Route all on-premises traffic to Google Cloud through an IPsec VPN tunnel to a VPC with Private Google Access enabled.

B.

Set up VPC peering between the hosts on-premises and the VPC through the internet.

C.

Enforce a security policy that mandates all applications to encrypt data with a Cloud Key Management Service (KMS) key before you send it over the network.

D.

Route all on-premises traffic to Google Cloud through a Dedicated or Partner Interconnect to a VPC with Private Google Access enabled.

Question 61

Options:

A.

Implement a Cloud Function that scans the environment variables multiple times a day, and creates a finding in Security Command Center if secrets are discovered.

B.

Implement regular peer reviews to assess the environment variables and identify secrets in your Cloud Functions. Raise a security incident if secrets are discovered.

C.

Use Sensitive Data Protection to scan the environment variables multiple times per day, and create a finding in Security Command Center if secrets are discovered.

D.

Integrate dynamic application security testing into the CI/CD pipeline that scans the application code for the Cloud Functions. Fail the build process if secrets are discovered.

Question 62

Your organization deploys a large number of containerized applications on Google Kubernetes Engine (GKE). Node updates are currently applied manually. Audit findings show that a critical patch has not been installed due to a missed notification. You need to design a more reliable, cloud-first, and scalable process for node updates. What should you do?

Options:

A.

Migrate the cluster infrastructure to a self-managed Kubernetes environment for greater control over the patching process.

B.

Develop a custom script to continuously check for patch availability, download patches, and apply the patches across all components of the cluster.

C.

Schedule a daily reboot for all nodes to automatically upgrade.

D.

Configure node auto-upgrades for node pools in the maintenance windows.

Question 63

A security audit uncovered several inconsistencies in your project’s Identity and Access Management (IAM) configuration. Some service accounts have overly permissive roles, and a few external collaborators have more access than necessary. You need to gain detailed visibility into changes to IAM policies, user activity, service account behavior, and access to sensitive projects. What should you do?

Options:

A.

Deploy the OS Config Management agent to your VMs. Use OS Config Management to create patch management jobs and monitor system modifications.

B.

Enable the metrics explorer in Cloud Monitoring to follow the service account authentication events and build alerts based on them.

C.

Use Cloud Audit Logs. Create log export sinks to send these logs to a security information and event management (SIEM) solution for correlation with other event sources.

D.

Configure Google Cloud Functions to be triggered by changes to IAM policies. Analyze changes by using the policy simulator, send alerts upon risky modifications, and store event details.

Question 64

An organization's security and risk management teams are concerned about where their responsibility lies for certain production workloads they are running in Google Cloud Platform (GCP), and where Google's responsibility lies. They are mostly running workloads using Google Cloud's Platform-as-a-Service (PaaS) offerings, primarily App Engine.

Which one of these areas in the technology stack would they need to focus on as their primary responsibility when using App Engine?

Options:

A.

Configuring and monitoring VPC Flow Logs

B.

Defending against XSS and SQLi attacks

C.

Managing the latest updates and security patches for the Guest OS

D.

Encrypting all stored data

Question 65

You are running a workload which processes very sensitive data that is intended to be used downstream by data scientists to train further models. The security team has very strict requirements around data handling and encryption, approved workloads, as well as separation of duties for the users of the output of the workload. You need to build the environment to support these requirements. What should you do?

Options:

A.

Use Confidential Computing on an N2D VM instance to process that data and output the results to a CMEK encrypted Cloud Storage bucket. Assign a storage object reader role to the data scientist service account. Manage access to this service account by using Workload Identity pools.

B.

Use Confidential Computing within Confidential Space, assign workload operator roles to the confidential compute VM service account. Assign the data collaborator role to the data scientist service account. Manage user access to these service accounts by using attestations and Workload Identity pools.

C.

Use Dataflow with Confidential Computing enabled to process the data and stream the results to a CMEK encrypted Cloud Storage bucket. Assign a storage object viewer role to the data scientist service account. Manage access to this service account by using Workload Identity pools.

D.

Use Dataproc with Confidential Computing enabled to process the data and stream the results to a CMEK encrypted Cloud Storage bucket. Assign a storage object reader role to the data scientist service account. Manage access to this service account by using Workload Identity pools.

Question 66

A company is backing up application logs to a Cloud Storage bucket shared with both analysts and the administrator. Analysts should only have access to logs that do not contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible by the administrator.

What should you do?

Options:

A.

Use Cloud Pub/Sub and Cloud Functions to trigger a Data Loss Prevention scan every time a file is uploaded to the shared bucket. If the scan detects PII, have the function move the file into a Cloud Storage bucket that is only accessible by the administrator.

B.

Upload the logs to both the shared bucket and the bucket only accessible by the administrator. Create a job trigger using the Cloud Data Loss Prevention API. Configure the trigger to delete any files from the shared bucket that contain PII.

C.

On the bucket shared with both the analysts and the administrator, configure Object Lifecycle Management to delete objects that contain any PII.

D.

On the bucket shared with both the analysts and the administrator, configure a Cloud Storage Trigger that is only triggered when PII data is uploaded. Use Cloud Functions to capture the trigger and delete such files.
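
To illustrate the event-driven pattern in option A: a Cloud Functions (1st gen) background function triggered by object finalization can inspect each new file with the DLP API and relocate it when findings appear. This is a minimal sketch under those assumptions; the bucket name, info types, and size cap are hypothetical choices.

    from google.cloud import dlp_v2, storage

    ADMIN_BUCKET = "pii-logs-admin-only"  # hypothetical restricted bucket

    def scan_uploaded_log(event, context):
        """Triggered by google.storage.object.finalize on the shared bucket."""
        storage_client = storage.Client()
        source_bucket = storage_client.bucket(event["bucket"])
        blob = source_bucket.blob(event["name"])
        text = blob.download_as_bytes()[:512000].decode("utf-8", errors="ignore")

        dlp = dlp_v2.DlpServiceClient()
        response = dlp.inspect_content(
            request={
                "parent": f"projects/{storage_client.project}",
                "inspect_config": {
                    "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}]
                },
                "item": {"value": text},
            }
        )
        if response.result.findings:
            # PII detected: move the object to the administrator-only bucket.
            source_bucket.copy_blob(blob, storage_client.bucket(ADMIN_BUCKET))
            blob.delete()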

Question 67

Your security team wants to implement a defense-in-depth approach to protect sensitive data stored in a Cloud Storage bucket. Your team has the following requirements:

The Cloud Storage bucket in Project A can only be readable from Project B.

The Cloud Storage bucket in Project A cannot be accessed from outside the network.

Data in the Cloud Storage bucket cannot be copied to an external Cloud Storage bucket.

What should the security team do?

Options:

A.

Enable domain restricted sharing in an organization policy, and enable uniform bucket-level access on the Cloud Storage bucket.

B.

Enable VPC Service Controls, create a perimeter around Projects A and B, and include the Cloud Storage API in the service perimeter configuration.

C.

Enable Private Access in both Project A and B's networks with strict firewall rules that allow communication between the networks.

D.

Enable VPC Peering between Project A and B's networks with strict firewall rules that allow communication between the networks.

Question 68

A company migrated its entire data center to Google Cloud Platform. It is running thousands of instances across multiple projects managed by different departments. You want to have a historical record of what was running in Google Cloud Platform at any point in time.

What should you do?

Options:

A.

Use Resource Manager on the organization level.

B.

Use Forseti Security to automate inventory snapshots.

C.

Use Stackdriver to create a dashboard across all projects.

D.

Use Security Command Center to view all assets across the organization.
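
Related background (a technique adjacent to, not among, the options above): Security Command Center's asset views and Forseti's inventory both build on resource metadata that Cloud Asset Inventory can also export as point-in-time snapshots. A minimal sketch with the google-cloud-asset client follows; the organization ID and destination bucket are hypothetical.

    from google.cloud import asset_v1

    client = asset_v1.AssetServiceClient()
    operation = client.export_assets(
        request={
            "parent": "organizations/123456789012",  # hypothetical organization ID
            "content_type": asset_v1.ContentType.RESOURCE,
            "output_config": {
                # hypothetical destination bucket for the snapshot
                "gcs_destination": {"uri": "gs://my-inventory-bucket/snapshot.json"}
            },
        }
    )
    operation.result()  # the export runs as a long-running operation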

Question 69

Your organization enforces a custom organization policy that disables the use of Compute Engine VM instances with external IP addresses. However, a regulated business unit requires an exception to temporarily use external IPs for a third-party audit process. The regulated business workload must comply with least privilege principles and minimize policy drift. You need to ensure secure policy management and proper handling. What should you do?

Options:

A.

Apply the restrictive organization policy at the organization level. Create an IAM custom role with permissions to bypass organization policies. Assign the custom role to the regulated business team for the specific project.

B.

Modify the custom organization policy at the organization level to allow external IPs for all projects. Configure VPC firewall rules to restrict egress traffic except for the regulated business workload.

C.

Apply the custom organization policy at the organization level to restrict external IPs. Move the regulated business workload to a separate folder. Override the policy at that folder level.

D.

Create a folder. Apply the restrictive organization policy for non-regulated business workloads in the folder. Place the regulated business workload in that folder.

Question 70

Your company’s cloud security policy dictates that VM instances should not have an external IP address. You need to identify the Google Cloud service that will allow VM instances without external IP addresses to connect to the internet to update the VMs. Which service should you use?

Options:

A.

Identity-Aware Proxy

B.

Cloud NAT

C.

TCP/UDP Load Balancing

D.

Cloud DNS
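
For background on option B: Cloud NAT is configured on a Cloud Router and gives instances without external IPs outbound internet access for tasks such as OS updates. A minimal sketch using the google-cloud-compute client; the project, region, network, and resource names are hypothetical.

    from google.cloud import compute_v1

    router = compute_v1.Router(
        name="nat-router",  # hypothetical router name
        network="projects/my-project/global/networks/my-vpc",  # hypothetical VPC
        nats=[
            compute_v1.RouterNat(
                name="outbound-nat",
                nat_ip_allocate_option="AUTO_ONLY",  # Google-managed NAT IPs
                source_subnetwork_ip_ranges_to_nat="ALL_SUBNETWORKS_ALL_IP_RANGES",
            )
        ],
    )
    compute_v1.RoutersClient().insert(
        project="my-project", region="us-central1", router_resource=router
    )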

Question 71

A website design company recently migrated all customer sites to App Engine. Some sites are still in progress and should only be visible to customers and company employees from any location.

Which solution will restrict access to the in-progress sites?

Options:

A.

Upload an .htaccess file containing the customer and employee user accounts to App Engine.

B.

Create an App Engine firewall rule that allows access from the customer and employee networks and denies all other traffic.

C.

Enable Cloud Identity-Aware Proxy (IAP), and allow access to a Google Group that contains the customer and employee user accounts.

D.

Use Cloud VPN to create a VPN connection between the relevant on-premises networks and the company’s GCP Virtual Private Cloud (VPC) network.

Question 72

Your company has been creating users manually in Cloud Identity to provide access to Google Cloud resources. Due to continued growth of the environment, you want to authorize the Google Cloud Directory Sync (GCDS) instance and integrate it with your on-premises LDAP server to onboard hundreds of users. You are required to:

Replicate user and group lifecycle changes from the on-premises LDAP server in Cloud Identity.

Disable any manually created users in Cloud Identity.

You have already configured the LDAP search attributes to include the users and security groups in scope for Google Cloud. What should you do next to complete this solution?

Options:

A.

1. Configure the option to suspend domain users not found in LDAP. 2. Set up a recurring GCDS task.

B.

1. Configure the option to delete domain users not found in LDAP. 2. Run GCDS after user and group lifecycle changes.

C.

1. Configure the LDAP search attributes to exclude manually created Cloud Identity users not found in LDAP. 2. Set up a recurring GCDS task.

D.

1. Configure the LDAP search attributes to exclude manually created Cloud Identity users not found in LDAP. 2. Run GCDS after user and group lifecycle changes.

Question 73

Your team needs to make sure that their backend database can only be accessed by the frontend application and no other instances on the network.

How should your team design this network?

Options:

A.

Create an ingress firewall rule to allow access only from the application to the database using firewall tags.

B.

Create a different subnet for the frontend application and database to ensure network isolation.

C.

Create two VPC networks, and connect the two networks using Cloud VPN gateways to ensure network isolation.

D.

Create two VPC networks, and connect the two networks using VPC peering to ensure network isolation.
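
To make option A concrete: an ingress rule keyed on network tags allows only tagged frontend instances to reach tagged database instances. A minimal sketch with the google-cloud-compute client; the tags, port, and rule name are hypothetical.

    from google.cloud import compute_v1

    firewall = compute_v1.Firewall(
        name="allow-frontend-to-db",  # hypothetical rule name
        network="global/networks/default",
        direction="INGRESS",
        source_tags=["frontend"],   # only traffic from instances tagged "frontend"
        target_tags=["database"],   # applies only to instances tagged "database"
        allowed=[compute_v1.Allowed(I_p_protocol="tcp", ports=["5432"])],  # e.g., PostgreSQL
    )
    compute_v1.FirewallsClient().insert(project="my-project", firewall_resource=firewall)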

Question 74

Your company is using GSuite and has developed an application meant for internal usage on Google App Engine. You need to make sure that an external user cannot gain access to the application even when an employee’s password has been compromised.

What should you do?

Options:

A.

Enforce 2-factor authentication in GSuite for all users.

B.

Configure Cloud Identity-Aware Proxy for the App Engine Application.

C.

Provision user passwords using GSuite Password Sync.

D.

Configure Cloud VPN between your private network and GCP.

Question 75

In order to meet PCI DSS requirements, a customer wants to ensure that all outbound traffic is authorized.

Which two cloud offerings meet this requirement without additional compensating controls? (Choose two.)

Options:

A.

App Engine

B.

Cloud Functions

C.

Compute Engine

D.

Google Kubernetes Engine

E.

Cloud Storage

Question 76

You are creating a new infrastructure CI/CD pipeline to deploy hundreds of ephemeral projects in your Google Cloud organization to enable your users to interact with Google Cloud. You want to restrict the use of the default networks in your organization while following Google-recommended best practices. What should you do?

Options:

A.

Enable the constraints/compute.skipDefaultNetworkCreation organization policy constraint at the organization level.

B.

Create a cron job to trigger a daily Cloud Function to automatically delete all default networks for each project.

C.

Grant your users the IAM Owner role at the organization level. Create a VPC Service Controls perimeter around the project that restricts the compute.googleapis.com API.

D.

Only allow your users to use your CI/CD pipeline with a predefined set of infrastructure templates they can deploy to skip the creation of the default networks.
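
For reference on option A: boolean constraints such as compute.skipDefaultNetworkCreation can be enforced programmatically as well as in the console. A rough sketch assuming the google-cloud-org-policy client (orgpolicy_v2); the organization ID is a hypothetical placeholder.

    from google.cloud import orgpolicy_v2

    client = orgpolicy_v2.OrgPolicyClient()
    policy = orgpolicy_v2.Policy(
        name="organizations/123456789012/policies/compute.skipDefaultNetworkCreation",
        spec=orgpolicy_v2.PolicySpec(
            rules=[orgpolicy_v2.PolicySpec.PolicyRule(enforce=True)]  # enforce the boolean constraint
        ),
    )
    client.create_policy(
        parent="organizations/123456789012",  # hypothetical organization ID
        policy=policy,
    )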

Question 77

Your organization is using Google Workspace, Google Cloud, and a third-party SIEM. You need to export events such as user logins, successful logins, and failed logins to the SIEM. Logs need to be ingested in real time or near real-time. What should you do?

Options:

A.

Create a Cloud Storage bucket as a sink for all logs. Configure the SIEM to periodically scan the bucket for new log files.

B.

Create a Cloud Logging sink to export relevant authentication logs to a Pub/Sub topic for SIEM subscription.

C.

Poll Cloud Logging for authentication events using the gcloud logging read tool. Forward the events to the SIEM.

D.

Configure Google Workspace to directly send logs to the API endpoint of the third-party SIEM.
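
On the consumption side of option B: the SIEM (or a small forwarder service) subscribes to the Pub/Sub topic that the Logging sink publishes to. A minimal pull-subscriber sketch follows; the project, subscription name, and the forward_to_siem stub are hypothetical stand-ins.

    from concurrent.futures import TimeoutError
    from google.cloud import pubsub_v1

    def forward_to_siem(data: bytes) -> None:
        print(data)  # stand-in for shipping the log entry to the SIEM's ingestion endpoint

    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path("my-project", "siem-auth-logs")  # hypothetical

    def callback(message: pubsub_v1.subscriber.message.Message) -> None:
        forward_to_siem(message.data)
        message.ack()

    streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
    with subscriber:
        try:
            streaming_pull.result(timeout=60)  # a real forwarder would run indefinitely
        except TimeoutError:
            streaming_pull.cancel()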

Question 78

You are implementing a new web application on Google Cloud that will be accessed from your on-premises network. To provide protection from threats like malware, you must implement transport layer security (TLS) interception for incoming traffic to your application. What should you do?

Options:

A.

Configure Secure Web Proxy. Offload the TLS traffic in the load balancer, inspect the traffic, and forward the traffic to the web application.

B.

Configure an internal proxy load balancer. Offload the TLS traffic in the load balancer, inspect the traffic, and forward the traffic to the web application.

C.

Configure a hierarchical firewall policy. Enable TLS interception by using Cloud Next Generation Firewall (NGFW) Enterprise.

D.

Configure a VPC firewall rule. Enable TLS interception by using Cloud Next Generation Firewall (NGFW) Enterprise.

Question 79

Your company runs a website that will store PII on Google Cloud Platform. To comply with data privacy regulations, this data can only be stored for a specific amount of time and must be fully deleted after this specific period. Data that has not yet reached the time period should not be deleted. You want to automate the process of complying with this regulation.

What should you do?

Options:

A.

Store the data in a single Persistent Disk, and delete the disk at expiration time.

B.

Store the data in a single BigQuery table and set the appropriate table expiration time.

C.

Store the data in a Cloud Storage bucket, and configure the bucket's Object Lifecycle Management feature.

D.

Store the data in a single BigTable table and set an expiration time on the column families.
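
To make option C concrete: an Object Lifecycle Management rule deletes each object once its age exceeds the mandated period while leaving newer objects untouched. A minimal sketch with the google-cloud-storage client; the bucket name and 365-day period are hypothetical.

    from google.cloud import storage

    client = storage.Client()
    bucket = client.get_bucket("pii-data-bucket")  # hypothetical bucket name
    bucket.add_lifecycle_delete_rule(age=365)  # delete objects older than the mandated period
    bucket.patch()  # persist the updated lifecycle configuration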

Question 80

A company is running workloads in a dedicated server room. The workloads must only be accessed from within the private company network. You need to connect to these workloads from Compute Engine instances within a Google Cloud Platform project.

Which two approaches can you take to meet the requirements? (Choose two.)

Options:

A.

Configure the project with Cloud VPN.

B.

Configure the project with Shared VPC.

C.

Configure the project with Cloud Interconnect.

D.

Configure the project with VPC peering.

E.

Configure all Compute Engine instances with Private Access.

Question 81

Your organization has 3 TB of information in BigQuery and Cloud SQL. You need to develop a cost-effective, scalable, and secure strategy to anonymize the personally identifiable information (PII) that exists today. What should you do?

Options:

A.

Scan your BigQuery and Cloud SQL data using the Cloud DLP data profiling feature. Use the data profiling results to create a de-identification strategy with either Cloud Sensitive Data Protection's de-identification templates or custom configurations.

B.

Create a new BigQuery dataset and Cloud SQL instance. Copy a small subset of the data to these new locations. Use Cloud Data Loss Prevention API to scan this subset for PII. Based on the results, create a custom anonymization script and apply the script to the entire 3 TB dataset in the original locations.

C.

Export all 3 TB of data from BigQuery and Cloud SQL to Cloud Storage. Use Cloud Sensitive Data Protection to anonymize the exported data. Re-import the anonymized data back into BigQuery and Cloud SQL.

D.

Inspect a representative sample of the data in BigQuery and Cloud SQL to identify PII. Based on this analysis, develop a custom script to anonymize the identified PII.
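
For background on option A: after profiling identifies where PII lives, the Sensitive Data Protection (DLP) API can de-identify it, for example by replacing findings with their info-type name. A minimal sketch; the project ID and sample text are hypothetical.

    from google.cloud import dlp_v2

    dlp = dlp_v2.DlpServiceClient()
    response = dlp.deidentify_content(
        request={
            "parent": "projects/my-project",  # hypothetical project ID
            "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
            "deidentify_config": {
                "info_type_transformations": {
                    "transformations": [
                        {"primitive_transformation": {"replace_with_info_type_config": {}}}
                    ]
                }
            },
            "item": {"value": "Contact jane.doe@example.com for details."},
        }
    )
    print(response.item.value)  # -> "Contact [EMAIL_ADDRESS] for details."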

Question 82

Your company is moving to Google Cloud. You plan to sync your users first by using Google Cloud Directory Sync (GCDS). Some employees have already created Google Cloud accounts by using their company email addresses that were created outside of GCDS. You must create your users on Cloud Identity.

What should you do?

Options:

A.

Configure GCDS and use GCDS search rules to sync these users.

B.

Use the transfer tool to migrate unmanaged users.

C.

Write a custom script to identify existing Google Cloud users and call the Admin SDK Directory API to transfer their account.

D.

Configure GCDS and use GCDS exclusion rules to ensure users are not suspended.

Question 83

To bring your company's messaging app into compliance with FIPS 140-2, a decision was made to use GCP compute and network services. The messaging app architecture includes a Managed Instance Group (MIG) that controls a cluster of Compute Engine instances. The instances use Local SSDs for data caching and UDP for instance-to-instance communications. The app development team is willing to make any changes necessary to comply with the standard.

Which options should you recommend to meet the requirements?

Options:

A.

Encrypt all cache storage and VM-to-VM communication using the BoringCrypto module.

B.

Set Disk Encryption on the Instance Template used by the MIG to a customer-managed key, and use BoringSSL for all data in transit between instances.

C.

Change the app instance-to-instance communications from UDP to TCP and enable BoringSSL on clients' TLS connections.

D.

Set Disk Encryption on the Instance Template used by the MIG to a Google-managed key, and use the BoringSSL library for all instance-to-instance communications.

Question 84

Which international compliance standard provides guidelines for information security controls applicable to the provision and use of cloud services?

Options:

A.

ISO 27001

B.

ISO 27002

C.

ISO 27017

D.

ISO 27018

Question 85

Your organization wants to be compliant with the General Data Protection Regulation (GDPR) on Google Cloud. You must implement data residency and operational sovereignty in the EU.

What should you do?

Choose 2 answers

Options:

A.

Limit the physical location of a new resource with the Organization Policy Service resource locations constraint.

B.

Use Cloud IDS to get east-west and north-south traffic visibility in the EU to monitor intra-VPC and inter-VPC communication.

C.

Limit Google personnel access based on predefined attributes such as their citizenship or geographic location by using Key Access Justifications.

D.

Use identity federation to limit access to Google Cloud resources from non-EU entities.

E.

Use VPC Flow Logs to monitor intra-VPC and inter-VPC traffic in the EU.

Question 86

All logs in your organization are aggregated into a centralized Google Cloud logging project for analysis and long-term retention. While most of the log data can be viewed by operations teams, there are specific sensitive fields (e.g., protoPayload.authenticationInfo.principalEmail) that contain identifiable information that should be restricted only to security teams. You need to implement a solution that allows different teams to view their respective application logs in the centralized logging project. It must also restrict access to specific sensitive fields within those logs to only a designated security group. Your solution must ensure that other fields in the same log entry remain visible to other authorized groups. What should you do?

Options:

A.

Configure field-level access in Cloud Logging by defining data access policies that specify sensitive fields and the authorized principals.

B.

Use Cloud IAM custom roles with specific permissions on logging.privateLogEntries.list. Define field-level access within the custom role's conditions.

C.

Implement a log sink to exclude sensitive fields before logs are sent to the centralized logging project. Create separate sinks for sensitive data.

D.

Create a BigQuery authorized view on the exported log sink to filter out the sensitive fields based on user groups.
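
For context on option A: Cloud Logging log buckets support restricting named fields so that only explicitly authorized principals can read them, while the rest of each entry stays visible. A rough sketch, assuming the logging_v2 config client and a hypothetical centralized bucket path; treat the field and method names as assumptions to verify against the current API.

    from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
    from google.cloud.logging_v2.types import LogBucket

    client = ConfigServiceV2Client()
    bucket = LogBucket(
        name="projects/central-logging/locations/global/buckets/_Default",  # hypothetical path
        restricted_fields=["protoPayload.authenticationInfo.principalEmail"],
    )
    client.update_bucket(
        request={
            "name": bucket.name,
            "bucket": bucket,
            "update_mask": {"paths": ["restricted_fields"]},
        }
    )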

Question 87

You work for a large organization that recently implemented a 100-Gbps Cloud Interconnect connection between your Google Cloud environment and your on-premises edge router. While routinely checking the connectivity, you noticed that the connection is operational, but there is an error message indicating that MACsec is operationally down. You need to resolve this error. What should you do?

Options:

A.

Ensure that the active pre-shared key created for MACsec is not expired on both the on-premises and Google edge routers.

B.

Ensure that the active pre-shared key matches on both the on-premises and Google edge routers.

C.

Ensure that the Cloud Interconnect connection supports MACsec.

D.

Ensure that the on-premises router is not down.

Question 88

A large e-retailer is moving to Google Cloud Platform with its ecommerce website. The company wants to ensure payment information is encrypted between the customer's browser and GCP when customers check out online.

What should they do?

Options:

A.

Configure an SSL Certificate on an L7 Load Balancer and require encryption.

B.

Configure an SSL Certificate on a Network TCP Load Balancer and require encryption.

C.

Configure the firewall to allow inbound traffic on port 443, and block all other inbound traffic.

D.

Configure the firewall to allow outbound traffic on port 443, and block all other outbound traffic.

Question 89

An organization is starting to move its infrastructure from its on-premises environment to Google Cloud Platform (GCP). The first step the organization wants to take is to migrate its current data backup and disaster recovery solutions to GCP for later analysis. The organization's production environment will remain on-premises for an indefinite time. The organization wants a scalable and cost-efficient solution.

Which GCP solution should the organization use?

Options:

A.

BigQuery using a data pipeline job with continuous updates

B.

Cloud Storage using a scheduled task and gsutil

C.

Compute Engine Virtual Machines using Persistent Disk

D.

Cloud Datastore using regularly scheduled batch upload jobs
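
Option B's pattern, a scheduled task copying backup artifacts into Cloud Storage, looks roughly like the following in Python (gsutil is the command-line equivalent named in the option). The bucket and file names are hypothetical; a cron job or scheduler would invoke the script.

    from google.cloud import storage

    def upload_backup(local_path: str) -> None:
        client = storage.Client()
        bucket = client.bucket("onprem-dr-backups")  # hypothetical bucket name
        blob = bucket.blob(f"nightly/{local_path}")
        blob.upload_from_filename(local_path)  # re-running simply overwrites the object

    upload_backup("db-dump-2024-01-01.sql.gz")  # hypothetical backup file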

Question 90

Your organization is migrating its primary web application from on-premises to Google Kubernetes Engine (GKE). You must advise the development team on how to grant their applications access to Google Cloud services from within GKE according to security recommended practices. What should you do?

Options:

A.

Create an application-specific IAM service account and generate a user-managed service account key for it. Inject the key into the workload by storing it as a Kubernetes secret within the same namespace as the application.

B.

Enable Workload Identity for GKE. Assign a Kubernetes service account to the application and configure that Kubernetes service account to act as an Identity and Access Management (IAM) service account. Grant the required roles to the IAM service account.

C.

Configure the GKE nodes to use the default Compute Engine service account.

D.

Create a user-managed service account with only the roles required for the specific workload. Assign this service account to the GKE nodes.
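
To illustrate option B: with Workload Identity enabled, a Kubernetes service account (KSA) is annotated to impersonate an IAM service account that has granted roles/iam.workloadIdentityUser to the KSA member. A minimal sketch with the official kubernetes Python client; all names are hypothetical.

    from kubernetes import client, config

    config.load_kube_config()  # or load_incluster_config() when run inside the cluster
    ksa = client.V1ServiceAccount(
        metadata=client.V1ObjectMeta(
            name="app-ksa",  # hypothetical Kubernetes service account
            namespace="prod",
            annotations={
                # binds the KSA to a hypothetical IAM service account
                "iam.gke.io/gcp-service-account": "app-gsa@my-project.iam.gserviceaccount.com"
            },
        )
    )
    client.CoreV1Api().create_namespaced_service_account(namespace="prod", body=ksa)
    # Separately, grant roles/iam.workloadIdentityUser on the IAM service account to:
    #   serviceAccount:my-project.svc.id.goog[prod/app-ksa]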

Question 91

You plan to deploy your cloud infrastructure using a CI/CD cluster hosted on Compute Engine. You want to minimize the risk of its credentials being stolen by a third party. What should you do?

Options:

A.

Create a dedicated Cloud Identity user account for the cluster. Use a strong self-hosted vault solution to store the user's temporary credentials.

B.

Create a dedicated Cloud Identity user account for the cluster. Enable the constraints/iam.disableServiceAccountCreation organization policy at the project level.

C.

Create a custom service account for the cluster. Enable the constraints/iam.disableServiceAccountKeyCreation organization policy at the project level.

D.

Create a custom service account for the cluster. Enable the constraints/iam.allowServiceAccountCredentialLifetimeExtension organization policy at the project level.

Question 92

A customer’s company has multiple business units. Each business unit operates independently, and each has their own engineering group. Your team wants visibility into all projects created within the company and wants to organize their Google Cloud Platform (GCP) projects based on different business units. Each business unit also requires separate sets of IAM permissions.

Which strategy should you use to meet these needs?

Options:

A.

Create an organization node, and assign folders for each business unit.

B.

Establish standalone projects for each business unit, using gmail.com accounts.

C.

Assign GCP resources in a project, with a label identifying which business unit owns the resource.

D.

Assign GCP resources in a VPC for each business unit to separate network access.
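
For background on option A: folders are created under the organization node, and IAM policies are then attached per folder. A minimal sketch with the google-cloud-resource-manager (v3) client; the organization ID and folder name are hypothetical.

    from google.cloud import resourcemanager_v3

    client = resourcemanager_v3.FoldersClient()
    operation = client.create_folder(
        request={
            "folder": {
                "parent": "organizations/123456789012",  # hypothetical organization ID
                "display_name": "business-unit-finance",  # one folder per business unit
            }
        }
    )
    folder = operation.result()  # folder creation is a long-running operation
    print(folder.name)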

Question 93

Last week, a company deployed a new App Engine application that writes logs to BigQuery. No other workloads are running in the project. You need to validate that all data written to BigQuery was done using the App Engine Default Service Account.

What should you do?

Options:

A.

1. Use Stackdriver Logging and filter on BigQuery insert jobs. 2. Click on the email address in line with the App Engine Default Service Account in the authentication field. 3. Click Hide Matching Entries. 4. Make sure the resulting list is empty.

B.

1. Use Stackdriver Logging and filter on BigQuery insert jobs. 2. Click on the email address in line with the App Engine Default Service Account in the authentication field. 3. Click Show Matching Entries. 4. Make sure the resulting list is empty.

C.

1. In BigQuery, select the related dataset. 2. Make sure the App Engine Default Service Account is the only account that can write to the dataset.

D.

1. Go to the IAM section of the project. 2. Validate that the App Engine Default Service Account is the only account that has a role that can write to BigQuery.
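
The log inspection described in options A and B amounts to querying Cloud Logging with a filter on BigQuery insert jobs and the caller identity. A minimal sketch with the google-cloud-logging client; the project ID and service account email are hypothetical placeholders (the App Engine default service account normally has the form PROJECT_ID@appspot.gserviceaccount.com).

    from google.cloud import logging

    client = logging.Client(project="my-project")  # hypothetical project ID
    log_filter = (
        'resource.type="bigquery_resource" '
        'AND protoPayload.methodName="jobservice.insert" '
        'AND protoPayload.authenticationInfo.principalEmail!="my-project@appspot.gserviceaccount.com"'
    )
    # Any entry returned here is a BigQuery insert job NOT made by the App Engine default SA.
    for entry in client.list_entries(filter_=log_filter):
        print(entry.payload)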

Question 94

You work at a company in a regulated industry and are responsible for ongoing security of the cloud environment. You need to prevent and detect misconfigurations in a particular folder, adhering to industry-specific compliance policies as well as policies internal to your company. What should you do?

Options:

A.

Enable Assured Workloads on the folder level, with the specific control bundle appropriate for your industry's regulations.

B.

Use Workload Manager with custom Rego policies to continuously scan the environment for misconfigurations on the folder level.

C.

Create a Posture file by using custom and predefined SHA or organization policies. Enforce the posture on the folder level.

D.

Create custom organization policies that follow specific business requirements. Enforce the policies on the folder level.
