Download Designing and Implementing Cloud-Native Applications Using Microsoft Azure Cosmos DB (beta).CertDumps.DP-420.2022-08-26.1e.22q.vcex


File Info

Exam Designing and Implementing Cloud-Native Applications Using Microsoft Azure Cosmos DB (beta)
Number DP-420
File Name Designing and Implementing Cloud-Native Applications Using Microsoft Azure Cosmos DB (beta).CertDumps.DP-420.2022-08-26.1e.22q.vcex
Size 1.15 MB
Posted August 26, 2022
Downloads 13

How to open VCEX & EXAM Files?

Files with VCEX & EXAM extensions can be opened by ProfExam Simulator.

Demo Questions

Question 1
You have a container named container1 in an Azure Cosmos DB Core (SQL) API account. You need to make the contents of container1 available as reference data for an Azure Stream Analytics job.  
Solution: You create an Azure Data Factory pipeline that uses Azure Cosmos DB Core (SQL) API as the input and Azure Blob Storage as the output.  
Does this meet the goal?

  • A: Yes
  • B: No



Question 2
You plan to create an Azure Cosmos DB Core (SQL) API account that will use customer-managed keys stored in Azure Key Vault. You need to configure an access policy in Key Vault to allow Azure Cosmos DB access to the keys. Which three permissions should you enable in the access policy?  
(Each correct answer presents part of the solution. Choose three.)

  • A: Wrap Key
  • B: Get
  • C: List
  • D: Update
  • E: Sign
  • F: Verify
  • G: Unwrap Key



Question 3
You are troubleshooting the current issues caused by the application updates. Which action can address the application updates issue without affecting the functionality of the application?

  • A: Enable time to live for the con-product container.
  • B: Set the default consistency level of account1 to strong.
  • C: Set the default consistency level of account1 to bounded staleness.
  • D: Add a custom indexing policy to the con-product container.



Question 4
You need to implement a trigger in Azure Cosmos DB Core (SQL) API that will run before an item is inserted into a container. Which two actions should you perform to ensure that the trigger runs?  
(Each correct answer presents part of the solution. Choose two.)

  • A: Append pre to the name of the JavaScript function trigger.
  • B: For each create request, set the access condition in RequestOptions.
  • C: Register the trigger as a pre-trigger.
  • D: For each create request, set the consistency level to session in RequestOptions.
  • E: For each create request, set the trigger name in RequestOptions.



Question 5
You have an application named App1 that reads the data in an Azure Cosmos DB Core (SQL) API account. App1 runs the same read queries every minute. The default consistency level for the account is set to eventual. You discover that every query consumes request units (RUs) instead of using the cache. You verify the IntegratedCacheItemHitRate metric and the IntegratedCacheQueryHitRate metric. Both metrics have values of 0. You verify that the dedicated gateway cluster is provisioned and used in the connection string. You need to ensure that App1 uses the Azure Cosmos DB integrated cache. What should you configure?

  • A: the indexing policy of the Azure Cosmos DB container
  • B: the consistency level of the requests from App1
  • C: the connectivity mode of the App1 CosmosClient
  • D: the default consistency level of the Azure Cosmos DB account



Question 6
You need to select the partition key for con-iot1. The solution must meet the IoT telemetry requirements. What should you select?  

  • A: the timestamp
  • B: the humidity
  • C: the temperature
  • D: the device ID



Question 7
You have an Azure Cosmos DB Core (SQL) API account that is used by 10 web apps. You need to analyze the data stored in the account by using Apache Spark to create machine learning models.  
The solution must NOT affect the performance of the web apps. Which two actions should you perform? (Each correct answer presents part of the solution. Choose two.)

  • A: In an Apache Spark pool in Azure Synapse, create a table that uses cosmos.olap as the data source.
  • B: Create a private endpoint connection to the account.
  • C: In an Azure Synapse Analytics serverless SQL pool, create a view that uses OPENROWSET and the CosmosDB provider.
  • D: Enable Azure Synapse Link for the account and Analytical store on the container.
  • E: In an Apache Spark pool in Azure Synapse, create a table that uses cosmos.oltp as the data source.



Question 8
You are implementing an Azure Data Factory data flow that will use an Azure Cosmos DB (SQL API) sink to write a dataset. The data flow will use 2,000 Apache Spark partitions. You need to ensure that the ingestion from each Spark partition is balanced to optimize throughput. Which sink setting should you configure?

  • A: Throughput
  • B: Write throughput budget
  • C: Batch size
  • D: Collection action



Question 9
You need to identify which connectivity mode to use when implementing App2. The solution must support the planned changes and meet the business requirements. Which connectivity mode should you identify?

  • A: Direct mode over HTTPS
  • B: Gateway mode (using HTTPS)
  • C: Direct mode over TCP



Question 10
You configure multi-region writes for account1. You need to ensure that App1 supports the new configuration for account1. The solution must meet the business requirements and the product catalog requirements. What should you do?

  • A: Set the default consistency level of account1 to bounded staleness.
  • B: Create a private endpoint connection.
  • C: Modify the connection policy of App1.
  • D: Increase the number of request units per second (RU/s) allocated to the con-product and con-productVendor containers.





