Google Cloud Associate Data Practitioner (ADP Exam) Questions and Answers

Question 21

You have a BigQuery dataset containing sales data. This data is actively queried for the first 6 months. After that, the data is not queried but needs to be retained for 3 years for compliance reasons. You need to implement a data management strategy that meets access and compliance requirements, while keeping cost and administrative overhead to a minimum. What should you do?

Options:

A. Use BigQuery long-term storage for the entire dataset. Set up a Cloud Run function to delete the data from BigQuery after 3 years.

B. Partition a BigQuery table by month. After 6 months, export the data to Coldline storage. Implement a lifecycle policy to delete the data from Cloud Storage after 3 years.

C. Set up a scheduled query to export the data to Cloud Storage after 6 months. Write a stored procedure to delete the data from BigQuery after 3 years.

D. Store all data in a single BigQuery table without partitioning or lifecycle policies.
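For illustration, option B's pattern can be sketched with the BigQuery Python client. This is a minimal sketch, not a reference implementation: the project, dataset, table, field, and bucket names are hypothetical, and in practice the export step would be driven by a schedule rather than run by hand.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical project, dataset, table, and field names.
table = bigquery.Table(
    "my-project.sales.transactions",
    schema=[
        bigquery.SchemaField("sale_date", "DATE"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ],
)
# Partition by month so aging data can be exported one partition at a time.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.MONTH,
    field="sale_date",
)
client.create_table(table)

# After 6 months, export a month's partition (selected via the $YYYYMM
# decorator) to a Coldline bucket; a lifecycle rule on that bucket then
# deletes the objects after 3 years with no further administration.
client.extract_table(
    "my-project.sales.transactions$202501",
    "gs://my-archive-bucket/sales/202501/*.avro",
    job_config=bigquery.ExtractJobConfig(
        destination_format=bigquery.DestinationFormat.AVRO
    ),
).result()
```

A matching lifecycle policy on the destination bucket (see the sketch under Question 23) handles the 3-year deletion.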

Question 22

Your organization uses scheduled queries to perform transformations on data stored in BigQuery. You discover that one of your scheduled queries has failed. You need to troubleshoot the issue as quickly as possible. What should you do?

Options:

A. Navigate to the Logs Explorer page in Cloud Logging. Use filters to find the failed job, and analyze the error details.

B. Set up a log sink using the gcloud CLI to export BigQuery audit logs to BigQuery. Query those logs to identify the error associated with the failed job ID.

C. Request access from your admin to the BigQuery INFORMATION_SCHEMA. Query the JOBS view with the failed job ID, and analyze the error details.

D. Navigate to the Scheduled queries page in the Google Cloud console. Select the failed job, and analyze the error details.
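Option D is a pure console workflow, so it needs no code. For comparison, option C's INFORMATION_SCHEMA lookup can be sketched as below; the region qualifier, job ID, and required permissions depend on your project, and the values here are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical job ID. Querying INFORMATION_SCHEMA.JOBS requires the
# appropriate region qualifier and job-listing permissions.
query = """
    SELECT job_id, state, error_result
    FROM `region-us`.INFORMATION_SCHEMA.JOBS
    WHERE job_id = @job_id
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("job_id", "STRING", "scheduled_query_12345"),
    ]
)
# error_result carries the reason and message for a failed job.
rows = client.query(query, job_config=job_config).result()
for row in rows:
    print(row.job_id, row.state, row.error_result)
```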

Question 23

You are responsible for managing Cloud Storage buckets for a research company. Your company has well-defined data tiering and retention rules. You need to optimize storage costs while achieving your data retention needs. What should you do?

Options:

A. Configure the buckets to use the Archive storage class.

B. Configure a lifecycle management policy on each bucket to downgrade the storage class and remove objects based on age.

C. Configure the buckets to use the Standard storage class and enable Object Versioning.

D. Configure the buckets to use the Autoclass feature.
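As a sketch of option B, the Cloud Storage Python client can encode tiering and retention rules as bucket lifecycle rules. The bucket name and age thresholds below are hypothetical placeholders for the company's actual rules.

```python
from google.cloud import storage
from google.cloud.storage.bucket import (
    LifecycleRuleDelete,
    LifecycleRuleSetStorageClass,
)

client = storage.Client()
bucket = client.get_bucket("research-data-bucket")  # hypothetical bucket name

# Encode the tiering and retention rules directly: downgrade the storage
# class by object age, then delete once the retention period expires.
bucket.lifecycle_rules = [
    LifecycleRuleSetStorageClass("NEARLINE", age=30),
    LifecycleRuleSetStorageClass("COLDLINE", age=180),
    LifecycleRuleSetStorageClass("ARCHIVE", age=365),
    LifecycleRuleDelete(age=365 * 3),
]
bucket.patch()
```

By contrast, Autoclass (option D) transitions objects automatically based on observed access patterns, which suits workloads without well-defined tiering rules.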

Question 24

You are using your own data to demonstrate the capabilities of BigQuery to your organization’s leadership team. You need to perform a one-time load of the files stored on your local machine into BigQuery with as little effort as possible. What should you do?

Options:

A. Write and execute a Python script using the BigQuery Storage Write API library.

B. Create a Dataproc cluster, copy the files to Cloud Storage, and write an Apache Spark job using the spark-bigquery-connector.

C. Execute the bq load command on your local machine.

D. Create a Dataflow job using the Apache Beam FileIO and BigQueryIO connectors with a local runner.
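For a one-time load of local files, option C's bq load command is the lightest-weight approach. For reference, a minimal sketch of the equivalent call with the BigQuery Python client, assuming a hypothetical CSV file and table name:

```python
from google.cloud import bigquery

# Roughly equivalent to option C's one-liner:
#   bq load --autodetect --skip_leading_rows=1 --source_format=CSV sales.demo ./sales.csv
client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    autodetect=True,        # infer the schema from the file
    skip_leading_rows=1,    # skip the CSV header row
)
with open("sales.csv", "rb") as f:
    load_job = client.load_table_from_file(
        f, "my-project.sales.demo", job_config=job_config
    )
load_job.result()  # block until the load completes
```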