

Salesforce Certified MuleSoft Platform Architect (Mule-Arch-201) Questions and Answers

Question 33

Refer to the exhibit.

What is the best way to decompose one end-to-end business process into a collaboration of Experience, Process, and System APIs?

Options:

A.

Handle customizations for the end-user application at the Process API level rather than the Experience API level

B.

Allow System APIs to return data that is NOT currently required by the identified Process or Experience APIs

C.

Always use a tiered approach by creating exactly one API for each of the 3 layers (Experience, Process and System APIs)

D.

Use a Process API to orchestrate calls to multiple System APIs, but NOT to other Process APIs
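
The layering these options describe can be sketched in plain code: each System API unlocks one back-end system, a Process API orchestrates them into a business view, and an Experience API adapts that view for a single client channel. The Python sketch below is purely illustrative; every function and field name in it is hypothetical and nothing here comes from a MuleSoft product API.

# Illustrative sketch of API-led layering (hypothetical names only).

def customers_system_api(customer_id: str) -> dict:
    # System API: unlocks the CRM system of record only.
    return {"id": customer_id, "name": "Alice", "segment": "gold"}

def orders_system_api(customer_id: str) -> list[dict]:
    # System API: unlocks the order-management system only.
    return [{"order_id": "o-1", "total": 120.0}, {"order_id": "o-2", "total": 75.5}]

def customer_overview_process_api(customer_id: str) -> dict:
    # Process API: orchestrates multiple System APIs into one business view,
    # with no channel-specific customization.
    customer = customers_system_api(customer_id)
    orders = orders_system_api(customer_id)
    return {"customer": customer, "orders": orders, "order_count": len(orders)}

def mobile_experience_api(customer_id: str) -> dict:
    # Experience API: adapts the process-level view for one end-user channel.
    overview = customer_overview_process_api(customer_id)
    return {
        "displayName": overview["customer"]["name"],
        "recentOrders": overview["orders"][:1],  # the mobile channel shows only the latest order
    }

if __name__ == "__main__":
    print(mobile_experience_api("c-42"))

In this shape, channel-specific customization stays in the Experience layer while the Process API remains reusable across channels, which is the trade-off the options above are probing.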

Question 34

What condition requires using a CloudHub Dedicated Load Balancer?

Options:

A.

When cross-region load balancing is required between separate deployments of the same Mule application

B.

When custom DNS names are required for API implementations deployed to customer-hosted Mule runtimes

C.

When API invocations across multiple CloudHub workers must be load balanced

D.

When server-side load-balanced TLS mutual authentication is required between API implementations and API clients
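
Option D turns on the phrase "TLS mutual authentication": the server side not only presents its own certificate but also requires and verifies a certificate from each client during the TLS handshake. A minimal Python sketch of server-side mutual TLS termination is shown below; the certificate file names are placeholders, not real paths, and this is not a CloudHub configuration.

import ssl
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Sketch of server-side TLS mutual authentication: the server presents its own
# certificate AND requires a client certificate signed by a trusted CA.
# File names are placeholders for illustration only.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile="server.crt", keyfile="server.key")
context.load_verify_locations(cafile="trusted_clients_ca.pem")
context.verify_mode = ssl.CERT_REQUIRED  # reject clients that present no valid certificate

server = HTTPServer(("0.0.0.0", 8443), SimpleHTTPRequestHandler)
server.socket = context.wrap_socket(server.socket, server_side=True)
server.serve_forever()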

Question 35

What is true about where an API policy is defined in Anypoint Platform and how it is then applied to API instances?

Options:

A.

The API policy is defined in Runtime Manager as part of the API deployment to a Mule runtime, and then ONLY applied to the specific API instance

B.

The API policy is defined in API Manager for a specific API instance, and then ONLY applied to the specific API instance

C.

The API policy is defined in API Manager and then automatically applied to ALL API instances

D.

The API policy is defined in API Manager, and then applied to ALL API instances in the specified environment
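
The options differ mainly in scope: whether a policy configured in API Manager attaches to one specific API instance or fans out to every instance in a wider grouping such as an environment. The toy Python model below illustrates those two scopes; none of the class or function names correspond to anything in Anypoint Platform.

from dataclasses import dataclass, field

@dataclass
class ApiInstance:
    name: str
    environment: str
    policies: list[str] = field(default_factory=list)

@dataclass
class Environment:
    name: str
    instances: list[ApiInstance] = field(default_factory=list)

def apply_to_instance(instance: ApiInstance, policy: str) -> None:
    # Instance scope: the policy lands on one specific API instance only.
    instance.policies.append(policy)

def apply_to_environment(env: Environment, policy: str) -> None:
    # Environment scope: the policy fans out to every instance in the environment.
    for instance in env.instances:
        instance.policies.append(policy)

if __name__ == "__main__":
    env = Environment("production", [ApiInstance("orders-api", "production"),
                                     ApiInstance("customers-api", "production")])
    apply_to_instance(env.instances[0], "rate-limiting")
    apply_to_environment(env, "client-id-enforcement")
    for inst in env.instances:
        print(inst.name, inst.policies)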

Question 36

A Rate Limiting policy is applied to an API implementation to protect the back-end system. Recently, there have been surges in demand that cause some API client POST requests to the API implementation to be rejected with policy-related errors, causing delays and complications to the API clients.

How should the API policies that are applied to the API implementation be changed to reduce the frequency of errors returned to API clients, while still protecting the back-end system?

Options:

A.

Keep the Rate Limiting policy and add a Client ID Enforcement policy

B.

Remove the Rate Limiting policy and add an HTTP Caching policy

C.

Remove the Rate Limiting policy and add a Spike Control policy

D.

Keep the Rate Limiting policy and add an SLA-based Spike Control policy
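
The distinction behind these options is behavioral: a Rate Limiting policy rejects requests outright once the window quota is exhausted, whereas Spike Control is designed to smooth bursts by queuing (delaying) excess requests and only failing them after the configured delay attempts run out. The rough Python contrast below is illustrative only; it is not MuleSoft's implementation of either policy, and all parameter names are made up.

import time
from collections import deque

class RateLimiter:
    """Reject immediately once the per-window limit is exceeded."""
    def __init__(self, limit: int, window_seconds: float):
        self.limit = limit
        self.window = window_seconds
        self.timestamps: deque[float] = deque()

    def allow(self) -> bool:
        now = time.monotonic()
        # Drop calls that fell outside the sliding window.
        while self.timestamps and now - self.timestamps[0] > self.window:
            self.timestamps.popleft()
        if len(self.timestamps) >= self.limit:
            return False  # surge: a policy error goes straight back to the client
        self.timestamps.append(now)
        return True

class SpikeControl:
    """Delay (queue) bursts instead of rejecting them straight away."""
    def __init__(self, max_concurrent: int, queue_delay: float, max_retries: int):
        self.max_concurrent = max_concurrent
        self.in_flight = 0
        self.queue_delay = queue_delay
        self.max_retries = max_retries

    def allow(self) -> bool:
        for _ in range(self.max_retries + 1):
            if self.in_flight < self.max_concurrent:
                self.in_flight += 1
                return True
            time.sleep(self.queue_delay)  # smooth the burst instead of failing fast
        return False  # only reject after the queued delay attempts are exhausted

    def release(self) -> None:
        self.in_flight = max(0, self.in_flight - 1)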