Download MuleSoft Certified Integration Architect - Level 1.MCIA-Level-1.CertDumps.2022-11-11.101q.vcex

Vendor: MuleSoft
Exam Code: MCIA-Level-1
Exam Name: MuleSoft Certified Integration Architect - Level 1
Date: Nov 11, 2022
File Size: 5 MB

How to open VCEX files?

Files with VCEX extension can be opened by ProfExam Simulator.

Purchase
Coupon: EXAM_HUB

Discount: 20%

Demo Questions

Question 1
A global organization operates datacenters in many countries. There are private network links between these datacenters because all business data (but NOT metadata) must be exchanged over these private network connections.
The organization does not currently use AWS in any way.
The strategic decision has just been made to rigorously minimize IT operations effort and investment going forward.
What combination of deployment options of the Anypoint Platform control plane and runtime plane(s) best serves this organization at the start of this strategic journey?
  1. MuleSoft-hosted Anypoint Platform control plane
    CloudHub Shared Worker Cloud in multiple AWS regions
  2. MuleSoft-hosted Anypoint Platform control plane
    Customer-hosted runtime plane in multiple AWS regions
  3. MuleSoft-hosted Anypoint Platform control plane
    Customer-hosted runtime plane in each datacenter
  4. Anypoint Platform - Private Cloud Edition
    Customer-hosted runtime plane in each datacenter
Correct answer: C
Question 2
Anypoint Exchange is required to maintain the source code of some of the assets committed to it, such as Connectors, Templates, and API specifications.
What is the best way to use an organization's source-code management (SCM) system in this context?
  1. Organizations need to point Anypoint Exchange to their SCM system so Anypoint Exchange can pull source code when requested by developers and provide it to Anypoint Studio
  2. Organizations need to use Anypoint Exchange as the main SCM system to centralize versioning and avoid code duplication
  3. Organizations can continue to use an SCM system of their choice for branching and merging, as long as they follow the branching and merging strategy enforced by Anypoint Exchange
  4. Organizations should continue to use an SCM system of their choice, in addition to keeping source code for these asset types in Anypoint Exchange, thereby enabling parallel development, branching, and merging
Correct answer: D
Question 3
An organization is designing an integration solution to replicate financial transaction data from a legacy system into a data warehouse (DWH).
The DWH must contain a daily snapshot of financial transactions, to be delivered as a CSV file. Daily transaction volume exceeds tens of millions of records, with significant spikes in volume during popular shopping periods.
What is the most appropriate integration style for an integration solution that meets the organization's current requirements?
  1. API-led connectivity
  2. Batch-triggered ETL
  3. Event-driven architecture
  4. Microservice architecture
Correct answer: B
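For context on why batch-triggered ETL fits here: the job only needs to run once per day, read that day's transactions, and stream them into a CSV file, so memory usage stays flat even at tens of millions of records. The following is a minimal, hedged Java sketch of that shape using plain JDBC; the connection URL, credentials, table, and column names are invented for illustration and are not part of the question.

import java.io.BufferedWriter;
import java.nio.file.Files;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.time.LocalDate;

public class DailyTransactionSnapshot {

    // Hypothetical source database and query; a real job would externalize these as configuration.
    private static final String JDBC_URL = "jdbc:postgresql://legacy-db:5432/finance";
    private static final String QUERY =
            "SELECT txn_id, account_id, amount, booked_at FROM transactions WHERE booked_date = ?";

    public static void main(String[] args) throws Exception {
        LocalDate snapshotDate = LocalDate.now().minusDays(1);
        Path csvFile = Path.of("transactions_" + snapshotDate + ".csv");

        try (Connection conn = DriverManager.getConnection(JDBC_URL, "etl_user", "etl_password");
             PreparedStatement stmt = conn.prepareStatement(QUERY);
             BufferedWriter out = Files.newBufferedWriter(csvFile)) {

            stmt.setFetchSize(10_000);           // stream rows instead of buffering the whole result set
            stmt.setObject(1, snapshotDate);     // JDBC 4.2 maps LocalDate to a SQL DATE parameter

            out.write("txn_id,account_id,amount,booked_at");
            out.newLine();

            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    out.write(rs.getLong("txn_id") + "," + rs.getLong("account_id") + ","
                            + rs.getBigDecimal("amount") + "," + rs.getTimestamp("booked_at"));
                    out.newLine();
                }
            }
        }
    }
}

In a Mule application the same shape would typically be a Scheduler trigger, a streaming Database Select, and a DataWeave transform to CSV, but the batch-triggered ETL principle is the same.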
Question 4
A set of integration Mule applications, some of which expose APIs, are being created to enable a new business process. Various stakeholders may be impacted by this. These stakeholders are a combination of semi-technical users (who understand basic integration terminology and concepts such as JSON and XML) and technically skilled potential consumers of the Mule applications and APIs.
What is an effective way for the project team responsible for the Mule applications and APIs being built to communicate with these stakeholders using Anypoint Platform and its supplied toolset?
  1. Create Anypoint Exchange entries with pages elaborating the integration design, including API notebooks (where applicable) to help the stakeholders understand and interact with the Mule applications and APIs at various levels of technical depth
  2. Capture documentation about the Mule applications and APIs inline within the Mule integration flows and use Anypoint Studio's Export Documentation feature to provide an HTML version of this documentation to the stakeholders
  3. Use Anypoint Design Center to implement the Mule applications and APIs and give the various stakeholders access to these Design Center projects, so they can collaborate and provide feedback
  4. Use Anypoint Exchange to register the various Mule applications and APIs and share the RAML definitions with the stakeholders, so they can be discovered
Correct answer: A
Question 5
A Mule application is being designed to do the following:
Step 1: Read a SalesOrder message from a JMS queue, where each SalesOrder consists of a header and a list of SalesOrderLineItems.
Step 2: Insert the SalesOrder header and each SalesOrderLineItem into different tables in an RDBMS.
Step 3: Insert the SalesOrder header and the sum of the prices of all its SalesOrderLineItems into a table in a different RDBMS.
No SalesOrder message can be lost and the consistency of all SalesOrder-related information in both RDBMSs must be ensured at all times.
What design choice (including choice of transactions) and order of steps addresses these requirements?
  1. 1. Read the JMS message (NOT in an XA transaction)
    2. Perform EACH DB insert in a SEPARATE DB transaction
    3. Acknowledge the JMS message
  2. 1. Read and acknowledge the JMS message (NOT in an XA transaction) 
    2. In a NEW XA transaction, perform BOTH DB inserts
  3. 1. Read the JMS message in an XA transaction 
    2. In the SAME XA transaction, perform BOTH DB inserts but do NOT acknowledge the JMS message
  4. 1. Read the JMS message (NOT in an XA transaction) 
    2. Perform BOTH DB inserts in ONE DB transaction
    3. Acknowledge the JMS message
Correct answer: C
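The essence of answer C is that, inside an XA transaction, the JMS acknowledgment is not a separate step: the transaction manager acknowledges the message only when the transaction (including both database inserts) commits, so a failure anywhere rolls everything back and the message is redelivered. The hedged Java sketch below shows that shape outside of Mule, as a Jakarta EE message-driven bean with container-managed XA resources; the queue, datasource, and table names are invented for illustration.

import jakarta.annotation.Resource;
import jakarta.ejb.ActivationConfigProperty;
import jakarta.ejb.MessageDriven;
import jakarta.jms.Message;
import jakarta.jms.MessageListener;
import jakarta.jms.TextMessage;
import javax.sql.DataSource;
import java.sql.Connection;
import java.sql.PreparedStatement;

// Container-managed (XA) transaction: the JMS receive and both inserts commit or roll back together,
// and the message is acknowledged only when that XA transaction commits.
@MessageDriven(activationConfig = {
    @ActivationConfigProperty(propertyName = "destinationLookup", propertyValue = "jms/SalesOrderQueue")
})
public class SalesOrderListener implements MessageListener {

    @Resource(lookup = "jdbc/OrderXADataSource")      // hypothetical XA datasource for the first RDBMS
    private DataSource orderDb;

    @Resource(lookup = "jdbc/ReportingXADataSource")  // hypothetical XA datasource for the second RDBMS
    private DataSource reportingDb;

    @Override
    public void onMessage(Message message) {
        try {
            String salesOrder = ((TextMessage) message).getText();
            // Parsing of the header and line items is omitted; only the transactional shape matters here.
            try (Connection c1 = orderDb.getConnection();
                 PreparedStatement insertHeader = c1.prepareStatement(
                         "INSERT INTO sales_order_header (payload) VALUES (?)")) {
                insertHeader.setString(1, salesOrder);
                insertHeader.executeUpdate();
            }
            try (Connection c2 = reportingDb.getConnection();
                 PreparedStatement insertSummary = c2.prepareStatement(
                         "INSERT INTO sales_order_summary (payload) VALUES (?)")) {
                insertSummary.setString(1, salesOrder);
                insertSummary.executeUpdate();
            }
            // No explicit acknowledge: the container acknowledges the message when the XA transaction commits.
        } catch (Exception e) {
            // A runtime exception marks the XA transaction for rollback, so the message is redelivered.
            throw new RuntimeException(e);
        }
    }
}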
Question 6
Refer to the exhibit. A Mule application is being designed to be deployed to several CloudHub workers. The Mule application's integration logic is to replicate changed Accounts from Salesforce to a backend system every 5 minutes.
A watermark will be used to only retrieve those Salesforce Accounts that have been modified since the last time the integration logic ran.
What is the most appropriate way to implement persistence for the watermark in order to support the required data replication integration logic?
    
  1. Persistent Object Store
  2. Persistent Cache Scope
  3. Persistent Anypoint MQ Queue
  4. Persistent VM Queue
Correct answer: A
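The watermark in this design is just the "last successfully replicated modification time", and it must be persistent and shared across all CloudHub workers, which is what the persistent Object Store provides. As a language-neutral illustration of the pattern (not the Mule Object Store API), this hedged Java sketch models the store and the Salesforce query behind hypothetical interfaces:

import java.time.Instant;
import java.util.List;

// Minimal sketch of the watermarking pattern behind the question; SharedKeyValueStore stands in for
// Mule's persistent Object Store and SalesforceClient for the Salesforce connector (both hypothetical).
public class AccountReplicationJob {

    interface SharedKeyValueStore {            // persistent and shared across all workers
        String get(String key);
        void put(String key, String value);
    }

    interface SalesforceClient {
        List<String> accountsModifiedSince(Instant watermark);
    }

    private static final String WATERMARK_KEY = "accounts.lastModifiedWatermark";

    private final SharedKeyValueStore store;
    private final SalesforceClient salesforce;

    AccountReplicationJob(SharedKeyValueStore store, SalesforceClient salesforce) {
        this.store = store;
        this.salesforce = salesforce;
    }

    // Invoked every 5 minutes by a scheduler.
    public void run() {
        String stored = store.get(WATERMARK_KEY);
        Instant watermark = (stored != null) ? Instant.parse(stored) : Instant.EPOCH;

        Instant pollStart = Instant.now();
        List<String> changedAccounts = salesforce.accountsModifiedSince(watermark);
        changedAccounts.forEach(this::replicateToBackend);

        // Persist the new watermark only after a successful replication pass.
        store.put(WATERMARK_KEY, pollStart.toString());
    }

    private void replicateToBackend(String account) {
        // Delivery to the backend system omitted.
    }
}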
Question 7
Refer to the exhibit. A shopping cart checkout process consists of a web store backend sending a sequence of API invocations to an Experience API, which in turn invokes a Process API. All API invocations are over HTTPS POST. The Java web store backend executes in a Java EE application server, while all API implementations are Mule applications executing in a customer-hosted Mule runtime.
End-to-end correlation of all HTTP requests and responses belonging to each individual checkout instance is required. This is to be done through a common correlation ID, so that all log entries written by the web store backend, Experience API implementation, and Process API implementation include the same correlation ID for all requests and responses belonging to the same checkout instance.
What is the most efficient way (using the least amount of custom coding or configuration) for the web store backend and the implementations of the Experience API and Process API to participate in end-to-end correlation of the API invocations for each checkout instance?
    
  1. The Experience API implementation generates a correlation ID for each incoming HTTP request and passes it to the web store backend in the HTTP response, which includes it in all subsequent API invocations to the Experience API
    The Experience API implementation must be coded to also propagate the correlation ID to the Process API in a suitable HTTP request header
  2. The web store backend generates a new correlation ID value at the start of checkout and sets it on the X-CORRELATION-ID HTTP request header in each API invocation belonging to that checkout
    No special code or configuration is included in the Experience API and Process API implementations to generate and manage the correlation ID
  3. The web store backend, being a Java EE application, automatically makes use of the thread-local correlation ID generated by the Java EE application server and automatically transmits that to the Experience API using HTTP-standard headers
    No special code or configuration is included in the web store backend, Experience API, and Process API implementations to generate and manage the correlation ID
  4. The web store backend sends a correlation ID value in the HTTP request body in the way required by the Experience API
    The Experience API and Process API implementations must be coded to receive the custom correlation ID in the HTTP requests and propagate it in suitable HTTP request headers
Correct answer: B
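Option B works with no changes on the Mule side because Mule 4 reads an incoming X-Correlation-ID header, uses it as the event's correlation ID (which appears in log entries), and propagates it on outbound HTTP requests by default. The only custom piece is therefore the web store backend, sketched here in Java with the standard HTTP client; the endpoint URL and request bodies are placeholders.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.UUID;

public class CheckoutClient {

    private static final HttpClient HTTP = HttpClient.newHttpClient();

    // Hypothetical Experience API endpoint.
    private static final String EXPERIENCE_API = "https://api.example.com/checkout";

    public static void main(String[] args) throws Exception {
        // One correlation ID per checkout instance, generated at the start of checkout...
        String correlationId = UUID.randomUUID().toString();

        // ...and sent on every API invocation belonging to that checkout.
        sendCheckoutStep(correlationId, "{\"step\":\"addPayment\"}");
        sendCheckoutStep(correlationId, "{\"step\":\"confirmOrder\"}");
    }

    private static void sendCheckoutStep(String correlationId, String jsonBody) throws Exception {
        HttpRequest request = HttpRequest.newBuilder(URI.create(EXPERIENCE_API))
                .header("X-Correlation-ID", correlationId)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(jsonBody))
                .build();

        HttpResponse<String> response = HTTP.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(correlationId + " -> HTTP " + response.statusCode());
    }
}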
Question 8
Mule application A receives a request Anypoint MQ message REQU with a payload containing a variable-length list of request objects. Application A uses the For Each scope to split the list into individual objects and sends each object as a message to an Anypoint MQ queue.
Service S listens on that queue, processes each message independently of all other messages, and sends a response message to a response queue.
Application A listens on that response queue and must, in turn, create and publish a response Anypoint MQ message RESP with a payload containing the list of responses sent by service S in the same order as the request objects originally sent in REQU.
Assume successful response messages are returned by service S for all request messages.
What is required so that application A can ensure that the length and order of the list of objects in RESP and REQU match, while at the same time maximizing message throughput?
    
  1. Perform all communication involving service S synchronously from within the For Each scope, so objects in RESP are in the exact same order as request objects in REQU
  2. Use a Scatter-Gather within the For Each scope to ensure response message order
    Configure the Scatter-Gather with a persistent object store
  3. Keep track of the list length and all object indices in REQU, both in the For Each scope and in all communication involving service S. Use persistent storage when creating RESP
  4. Use an Async scope within the For Each scope and collect response messages in a second For Each scope in the order in which they arrive, then send RESP using this list of responses
Correct answer: C
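The reason option C maximizes throughput while preserving order is that responses can be processed as they arrive: each message sent to service S carries its original index and the list length, and each response is slotted back into its position until all slots are filled. A minimal in-memory Java illustration of that bookkeeping follows; in application A the slots would live in persistent storage (for example an object store) rather than in memory, and the class and method names here are invented.

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of the index-based correlation behind option C. ResponseCollector would normally be backed by
// persistent storage so that a worker restart does not lose partially collected responses.
public class OrderedFanOutFanIn {

    static final class ResponseCollector {
        private final String[] slots;                 // one slot per request object, in REQU order
        private final AtomicInteger received = new AtomicInteger();

        ResponseCollector(int listLength) {
            this.slots = new String[listLength];
        }

        // Called for every response from service S; responses may arrive in any order.
        // Returns the ordered response list once the last slot is filled, otherwise null.
        List<String> accept(int index, String responsePayload) {
            slots[index] = responsePayload;
            if (received.incrementAndGet() == slots.length) {
                return new ArrayList<>(Arrays.asList(slots));   // same length and order as REQU
            }
            return null;
        }
    }

    public static void main(String[] args) {
        List<String> requestObjects = List.of("objA", "objB", "objC");
        ResponseCollector collector = new ResponseCollector(requestObjects.size());

        // Simulate out-of-order responses from service S, each still carrying its original index.
        collector.accept(2, "respC");
        collector.accept(0, "respA");
        List<String> resp = collector.accept(1, "respB");

        System.out.println(resp);   // [respA, respB, respC] -- ready to publish as RESP
    }
}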
Question 9
Refer to the exhibit. A Mule application is deployed to a cluster of two customer-hosted Mule runtimes. The Mule application has a flow that polls a database and another flow with an HTTP Listener.
HTTP clients send HTTP requests directly to individual cluster nodes.
What happens to database polling and HTTP request handling in the time after the primary (master) node of the cluster has failed, but before that node is restarted?
  1. Database polling stops
    All HTTP requests are rejected
  2. Database polling stops
    All HTTP requests continue to be accepted
  3. Database polling continues
    Only HTTP requests sent to the remaining node continue to be accepted
  4. Database polling continues
    All HTTP requests continue to be accepted, but requests to the failed node incur increased latency
Correct answer: C
Question 10
What aspects of a CI/CD pipeline for Mule applications can be automated using MuleSoft-provided Maven plugins?
  1. Import from API designer, compile, package, unit test, deploy, publish to Anypoint Exchange
  2. Compile, package, unit test, validate unit test coverage, deploy
  3. Compile, package, unit test, deploy, integration test
  4. Compile, package, unit test, deploy, create associated API instances in API Manager
Correct answer: B
Explanation:
Reference: http://workshop.tools.mulesoft.com/modules/module7_lab4#step-2-configure-the-mule-mavenplugin     