
Google Cloud Certified Security-Operations-Engineer Study Notes

Google Cloud Certified - Professional Security Operations Engineer (PSOE) Exam Questions and Answers

Question 9

You are conducting proactive threat hunting in your company's Google Cloud environment. You suspect that an attacker compromised a developer's credentials and is attempting to move laterally from a development Google Kubernetes Engine (GKE) cluster to critical production systems. You need to identify indicators of compromise (IoCs) and prioritize investigative actions by using Google Cloud's security tools before analyzing raw logs in detail. What should you do next?

Options:

A.

In the Security Command Center (SCC) console, apply filters for the cluster and analyze the resulting aggregated findings' timeline and details for IoCs. Examine the attack path simulations associated with attack exposure scores to prioritize subsequent actions.

B.

Review threat intelligence feeds within Google Security Operations (SecOps), and enrich any anomalies with context on known IoCs, attacker tactics, techniques, and procedures (TTPs), and campaigns.

C.

Investigate Virtual Machine (VM) Threat Detection findings in Security Command Center (SCC). Filter for VM Threat Detection findings to target the Compute Engine instances that serve as the nodes for the cluster, and look for malware or rootkits on the nodes.

D.

Create a Google SecOps SOAR playbook that automatically isolates any GKE resources exhibiting unusual network connections to production environments and triggers an alert to the incident response team.
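For reference, the console filtering described in option A can also be approximated from the command line with the SCC `gcloud` surface. This is a hedged sketch: the organization ID and cluster name below are placeholders, and the exact filter fields available depend on your SCC tier.

```shell
# List active SCC findings whose affected resource mentions the suspect
# development GKE cluster. "123456789" and "dev-cluster" are placeholders.
gcloud scc findings list organizations/123456789 \
  --filter='state="ACTIVE" AND resource_name:"dev-cluster"' \
  --page-size=50
```

Reviewing these findings, and the attack exposure scores attached to them, is what lets you prioritize before dropping into raw log analysis.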

Question 10

You are implementing Google Security Operations (SecOps) for your organization. Your organization has its own threat intelligence feed, which has been ingested into Google SecOps by using a native integration with a Malware Information Sharing Platform (MISP). You are working on the following detection rule to leverage the command and control (C2) indicators that were ingested into the entity graph.

What code should you add to the detection rule to filter for the domain IoCs?

Options:

A.

$ioc.graph.metadata.entity_type = "DOMAIN_NAME"

$ioc.graph.metadata.source_type = "ENTITY_CONTEXT"

B.

$ioc.graph.metadata.entity_type = "DOMAIN_NAME"

$ioc.graph.metadata.source_type = "GLOBAL_CONTEXT"

C.

$ioc.graph.metadata.entity_type = "DOMAIN_NAME"

$ioc.graph.metadata.source_type = "DERIVED_CONTEXT"

D.

$ioc.graph.metadata.entity_type = "DOMAIN_NAME"

$ioc.graph.metadata.source_type = "SOURCE_TYPE_UNSPECIFIED"
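For context, filter lines like the ones in these options would sit inside the events section of a YARA-L 2.0 rule that joins event data against entity-graph entities. The sketch below shows one plausible shape; the rule name, event variable, and UDM event path are illustrative, not taken from the question.

```
rule c2_domain_ioc_match {
  meta:
    // hypothetical metadata values
    author = "secops-team"
    description = "Match DNS lookups against MISP domain IOCs in the entity graph"

  events:
    // a network event that resolves a domain (event path is illustrative)
    $e.network.dns.questions.name = $domain

    // the IOC entity ingested from the threat intelligence feed
    $ioc.graph.metadata.entity_type = "DOMAIN_NAME"
    $ioc.graph.metadata.source_type = "GLOBAL_CONTEXT"
    $ioc.graph.entity.domain.name = $domain

  match:
    $domain over 1h

  condition:
    $e and $ioc
}
```

The join on the `$domain` placeholder is what ties observed DNS activity to the ingested C2 indicators.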

Question 11

You scheduled a Google Security Operations (SecOps) report to export results to a BigQuery dataset in your Google Cloud project. The report executes successfully in Google SecOps, but no data appears in the dataset. You confirmed that the dataset exists. How should you address this export failure?

Options:

A.

Grant the Google SecOps service account the roles/iam.serviceAccountUser IAM role on itself.

B.

Set a retention period for the BigQuery export.

C.

Grant the user account that scheduled the report the roles/bigquery.dataEditor IAM role on the project.

D.

Grant the Google SecOps service account the roles/bigquery.dataEditor IAM role on the dataset.
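As a hedged sketch of how a dataset-level grant like option D can be applied with the `bq` CLI: dataset access entries use the legacy role names, where WRITER corresponds to roles/bigquery.dataEditor. The project, dataset, and service account email below are placeholders.

```shell
# Dump the dataset's current metadata, including its "access" array.
bq show --format=prettyjson my_project:secops_reports > dataset.json

# Manually append an entry such as the following to the "access" array
# (service account email is a placeholder for your tenant's export account):
#   {"role": "WRITER", "userByEmail": "secops-export@example.iam.gserviceaccount.com"}

# Apply the updated access list back to the dataset.
bq update --source dataset.json my_project:secops_reports
```

Granting the role on the dataset rather than the project keeps the service account's write access scoped to the export target.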

Question 12

Your organization's Google Security Operations (SecOps) tenant is ingesting a vendor's firewall logs in its default JSON format using the Google-provided parser for that log. The vendor recently released a patch that introduces a new field and renames an existing field in the logs. The parser does not recognize these two fields and they remain available only in the raw logs, while the rest of the log is parsed normally. You need to resolve this logging issue as soon as possible while minimizing the overall change management impact. What should you do?

Options:

A.

Use the web interface-based custom parser feature in Google SecOps to copy the parser, and modify it to map both fields to UDM.

B.

Use the Extract Additional Fields tool in Google SecOps to convert the raw log entries to additional fields.

C.

Deploy a third-party data pipeline management tool to ingest the logs, and transform the updated fields into fields supported by the default parser.

D.

Write a code snippet, and deploy it in a parser extension to map both fields to UDM.
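To make option D concrete: parser extensions accept a code snippet in the same Logstash-style syntax as the base parser, which can remap the renamed field and map the new field to UDM without touching the Google-managed default parser. The vendor field names and UDM paths below are purely illustrative assumptions, not taken from any real vendor schema.

```
filter {
  # Parse the vendor's JSON payload (field names below are hypothetical).
  json {
    source => "message"
  }
  mutate {
    replace => {
      # Map the renamed vendor field back to the UDM field the base parser used.
      "event.idm.read_only_udm.principal.hostname" => "%{device_hostname}"
      # Map the newly introduced vendor field to an appropriate UDM field.
      "event.idm.read_only_udm.security_result.severity_details" => "%{threat_score}"
    }
  }
}
```

Because the extension layers on top of the default parser, the rest of the log continues to be parsed by Google's mapping, which is why this approach minimizes change management impact.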