
Professional-Cloud-Security-Engineer Premium Exam Questions

Google Cloud Certified - Professional Cloud Security Engineer Questions and Answers

Question 41

Your Google Cloud organization distributes administrative capabilities to each team by provisioning a Google Cloud project with the Owner role (roles/owner). The organization contains thousands of Google Cloud projects. Security Command Center Premium has surfaced multiple OPEN_MYSQL_PORT findings. You are enforcing guardrails and need to prevent these types of common misconfigurations.

What should you do?

Options:

A.

Create a firewall rule for each virtual private cloud (VPC) to deny traffic from 0.0.0.0/0 with priority 0.

B.

Create a hierarchical firewall policy configured at the organization to deny all connections from 0.0.0.0/0.

C.

Create a Google Cloud Armor security policy to deny traffic from 0.0.0.0/0.

D.

Create a hierarchical firewall policy configured at the organization to allow connections only from internal IP ranges.
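
Option B describes an organization-level hierarchical firewall policy, which is the kind of control that applies to every project regardless of project-level Owner rights. Purely as a hedged sketch (the policy short name, organization ID, and rule priority below are placeholders, not values from the question), such a policy could be created and attached like this; the rule shown closes the MySQL port (tcp:3306) that OPEN_MYSQL_PORT findings flag, and a broader deny rule would follow the same pattern.

    # Create an organization-level hierarchical firewall policy (short name and org ID are placeholders).
    gcloud compute firewall-policies create \
        --short-name=block-open-mysql \
        --organization=123456789012

    # Add a rule that denies inbound MySQL (tcp:3306) from any source.
    gcloud compute firewall-policies rules create 1000 \
        --firewall-policy=block-open-mysql \
        --organization=123456789012 \
        --direction=INGRESS \
        --action=deny \
        --src-ip-ranges=0.0.0.0/0 \
        --layer4-configs=tcp:3306

    # Associate the policy at the organization node so it applies to every project underneath.
    gcloud compute firewall-policies associations create \
        --firewall-policy=block-open-mysql \
        --organization=123456789012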

Question 42

Your company is migrating a customer database that contains personally identifiable information (PII) to Google Cloud. To prevent accidental exposure, this data must be protected at rest. You need to ensure that all PII is automatically discovered and redacted or pseudonymized before any type of analysis. What should you do?

Options:

A.

Implement Cloud Armor to protect the database from external threats and configure firewall rules to restrict network access to only authorized internal IP addresses.

B.

Configure Sensitive Data Protection to scan the database for PII using both predefined and custom infoTypes and to mask sensitive data.

C.

Use Cloud KMS to encrypt the database at rest with a customer-managed encryption key (CMEK). Implement VPC Service Controls.

D.

Create Cloud Storage buckets with object versioning enabled, and use IAM policies to restrict access to the data. Use the Data Loss Prevention API (DLP API) on the buckets to scan for sensitive data and generate detection alerts.
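
Option B refers to Sensitive Data Protection, whose DLP API can inspect content for predefined and custom infoTypes and transform what it finds. The curl call below is only a minimal, assumed sketch of a content:deidentify request that masks matches for a few predefined infoTypes; the project ID, sample text, and chosen infoTypes are illustrative, not values from the question.

    # Minimal Sensitive Data Protection (DLP API) de-identification request (project ID is a placeholder).
    curl -s -X POST \
      -H "Authorization: Bearer $(gcloud auth print-access-token)" \
      -H "Content-Type: application/json" \
      "https://dlp.googleapis.com/v2/projects/my-project/content:deidentify" \
      -d '{
        "item": { "value": "Contact Jane Doe at jane.doe@example.com or 555-0100" },
        "inspectConfig": {
          "infoTypes": [
            { "name": "PERSON_NAME" },
            { "name": "EMAIL_ADDRESS" },
            { "name": "PHONE_NUMBER" }
          ]
        },
        "deidentifyConfig": {
          "infoTypeTransformations": {
            "transformations": [
              { "primitiveTransformation": { "characterMaskConfig": { "maskingCharacter": "#" } } }
            ]
          }
        }
      }'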

Question 43

Your organization is worried about recent news headlines regarding application vulnerabilities in production applications that have led to security breaches. You want to automatically scan your deployment pipeline for vulnerabilities and ensure only scanned and verified containers can run in the environment. What should you do?

Options:

A.

Enable Binary Authorization and create attestations of scans.

B.

Use gcloud artifacts docker images describe LOCATION-docker.pkg.dev/PROJECT_ID/REPOSITORY/IMAGE_ID@sha256:HASH --show-package-vulnerability in your CI/CD pipeline, and trigger a pipeline failure for critical vulnerabilities.

C.

Use Kubernetes role-based access control (RBAC) as the source of truth for cluster access by granting "container.clusters.get" to limited users. Restrict deployment access by allowing these users to generate a kubeconfig file containing the configuration needed to access the GKE cluster.

D.

Enforce the use of Cloud Code for development so users receive real-time security feedback on vulnerable libraries and dependencies before they check in their code.
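
Option A relies on Binary Authorization, where attestations created from scan results gate which container images are allowed to run. The commands below are a hedged sketch of that flow; the policy file, attestor, image digest, KMS key version, and cluster name are hypothetical placeholders, and attestations sign-and-create may require the beta component on older gcloud releases.

    # Import a Binary Authorization policy that requires attestations (policy.yaml is hypothetical).
    gcloud container binauthz policy import policy.yaml

    # After the CI/CD vulnerability scan passes, sign and create an attestation for the image digest
    # (attestor, project, and KMS key version are placeholders).
    gcloud container binauthz attestations sign-and-create \
        --artifact-url="us-docker.pkg.dev/my-project/my-repo/my-image@sha256:abc123" \
        --attestor="vulnerability-scan-attestor" \
        --attestor-project="my-project" \
        --keyversion="projects/my-project/locations/global/keyRings/binauthz/cryptoKeys/attestor-key/cryptoKeyVersions/1"

    # Enforce Binary Authorization on the GKE cluster so only attested images are admitted.
    gcloud container clusters update my-cluster \
        --zone=us-central1-a \
        --binauthz-evaluation-mode=PROJECT_SINGLETON_POLICY_ENFORCE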

Question 44

Your company's storage team manages all product images within a specific Google Cloud project. To maintain control, you must isolate access to Cloud Storage for this project, allowing the storage team to manage restrictions at the project level. They must be restricted to using corporate computers. What should you do?

Options:

A.

Employ organization-level firewall rules to block all traffic to Cloud Storage. Create exceptions for specific service accounts used by the storage team within their project.

B.

Implement VPC Service Controls by establishing an organization-wide service perimeter with all projects. Configure ingress and egress rules to restrict access to Cloud Storage based on IP address ranges.

C.

Use Context-Aware Access. Create an access level that defines the required context. Apply it as an organization policy specifically at the project level, restricting access to Cloud Storage based on that context.

D.

Use Identity and Access Management (IAM) roles at the project level within the storage team's project. Grant the storage team granular permissions on the project's Cloud Storage resources.
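
Option C is built on Access Context Manager: Context-Aware Access levels capture the device context (for example, corporate-owned machines) that access decisions can then require. As an illustrative sketch only, the snippet below creates such an access level; the access policy ID, level name, and spec file contents are assumptions, and the level would still need to be applied to the project's Cloud Storage access as the option describes.

    # corp-device-spec.yaml -- basic access level conditions (contents are illustrative):
    # - devicePolicy:
    #     requireCorpOwned: true

    # Create the access level in the organization's access policy (policy ID is a placeholder).
    gcloud access-context-manager levels create corp_devices_only \
        --policy=123456789 \
        --title="Corporate devices only" \
        --basic-level-spec=corp-device-spec.yaml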