Download Data Engineering on Microsoft Azure (beta).dump4pass.DP-203.2022-01-17.3e.78q.vcex


File Info

Exam: Data Engineering on Microsoft Azure (beta)
Number: DP-203
File Name: Data Engineering on Microsoft Azure (beta).dump4pass.DP-203.2022-01-17.3e.78q.vcex
Size: 4.99 MB
Posted: January 17, 2022
Downloads: 13

How to open VCEX & EXAM Files?

Files with the VCEX and EXAM extensions can be opened with ProfExam Simulator.




Demo Questions

Question 1
You have files and folders in Azure Data Lake Storage Gen2 for an Azure Synapse workspace as shown in the following exhibit.
You create an external table named ExtTable that has LOCATION='/topfolder/'.    
When you query ExtTable by using an Azure Synapse Analytics serverless SQL pool, which files are returned?

  • A: File2.csv and File3.csv only 
  • B: File1.csv and File4.csv only
  • C: File1.csv, File2.csv, File3.csv, and File4.csv
  • D: File1.csv only
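For reference, a minimal sketch of how such an external table might be defined in a serverless SQL pool. The data source, file format, storage URL, and column names (SalesLake, CsvFormat, Col1, Col2) are placeholders, not part of the question:

    -- Hypothetical data source and file format; the storage URL is a placeholder.
    CREATE EXTERNAL DATA SOURCE SalesLake
    WITH (LOCATION = 'https://account.dfs.core.windows.net/container');

    CREATE EXTERNAL FILE FORMAT CsvFormat
    WITH (FORMAT_TYPE = DELIMITEDTEXT,
          FORMAT_OPTIONS (FIELD_TERMINATOR = ','));

    -- LOCATION matches the question; the folder is resolved when the table is queried.
    CREATE EXTERNAL TABLE ExtTable
    (
        Col1 NVARCHAR(100),
        Col2 NVARCHAR(100)
    )
    WITH (
        LOCATION = '/topfolder/',
        DATA_SOURCE = SalesLake,
        FILE_FORMAT = CsvFormat
    );

Querying it is then an ordinary SELECT, for example SELECT TOP 10 * FROM ExtTable;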



Question 2
You are planning the deployment of Azure Data Lake Storage Gen2.  
You have the following two reports that will access the data lake: 
Report1: Reads three columns from a file that contains 50 columns. 
Report2: Queries a single record based on a timestamp.   
You need to recommend in which format to store the data in the data lake to support the reports. The solution must minimize read times.    
What should you recommend for each report? To answer, select the appropriate options in the answer area.    
NOTE: Each correct selection is worth one point. 
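As background for the format choice: columnar formats such as Parquet let the engine read only the columns a query touches, which matters for Report1's three-of-50-column pattern. A minimal serverless SQL pool sketch; the storage URL, path, and column names are placeholders:

    -- Hypothetical path and columns; only the three selected columns are read
    -- from storage because Parquet stores each column separately.
    SELECT ColA, ColB, ColC
    FROM OPENROWSET(
        BULK 'https://account.dfs.core.windows.net/container/report1/*.parquet',
        FORMAT = 'PARQUET'
    ) AS r;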




Question 3
You have an Azure Data Lake Storage Gen2 container.    
Data is ingested into the container, and then transformed by a data integration application. The data is NOT modified after that. Users can read files in the container but cannot modify the files.  
You need to design a data archiving solution that meets the following requirements: 
New data is accessed frequently and must be available as quickly as possible.  
Data that is older than five years is accessed infrequently but must be available within one second when requested.  
Data that is older than seven years is NOT accessed. After seven years, the data must be persisted at the lowest cost possible.  
Costs must be minimized while maintaining the required availability.    
How should you manage the data? To answer, select the appropriate options in the answer area.    
NOTE: Each correct selection is worth one point.




Question 4
You have an enterprise-wide Azure Data Lake Storage Gen2 account. The data lake is accessible only through an Azure virtual network named VNET1.    
You are building a SQL pool in Azure Synapse that will use data from the data lake.    
Your company has a sales team. All the members of the sales team are in an Azure Active Directory group named Sales. POSIX controls are used to assign the Sales group access to the files in the data lake.    
You plan to load data to the SQL pool every hour.    
You need to ensure that the SQL pool can load the sales data from the data lake.    
Which three actions should you perform? Each correct answer presents part of the solution.    
NOTE: Each correct selection is worth one point.

  • A: Add the managed identity to the Sales group.
  • B: Use the managed identity as the credentials for the data load process.
  • C: Create a shared access signature (SAS). 
  • D: Add your Azure Active Directory (Azure AD) account to the Sales group.
  • E: Use the shared access signature (SAS) as the credentials for the data load process.
  • F: Create a managed identity.
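For context, once a managed identity exists and has been granted access to the files, one common pattern is to use it as the credential for the load itself, for example with the COPY statement in the SQL pool. The table name and storage URL below are placeholders:

    -- Hypothetical table and path; assumes the pool's managed identity has been
    -- granted access to the files (for example, via the Sales group's POSIX ACLs).
    COPY INTO dbo.StageSales
    FROM 'https://account.dfs.core.windows.net/sales/daily/*.csv'
    WITH (
        FILE_TYPE = 'CSV',
        CREDENTIAL = (IDENTITY = 'Managed Identity'),
        FIRSTROW = 2
    );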



Question 5
You have a SQL pool in Azure Synapse.    
You plan to load data from Azure Blob storage to a staging table. Approximately 1 million rows of data will be loaded daily. The table will be truncated before each daily load.    
You need to create the staging table. The solution must minimize how long it takes to load the data to the staging table.    
How should you configure the table? To answer, select the appropriate options in the answer area.    
NOTE: Each correct selection is worth one point.    
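As background: for truncate-and-reload staging tables, a heap with round-robin distribution is a common low-overhead choice, because the load pays for neither index maintenance nor data movement to a specific distribution. A sketch with placeholder columns:

    -- Hypothetical column list; HEAP avoids index maintenance during the load,
    -- and ROUND_ROBIN spreads rows evenly without hashing on a key.
    CREATE TABLE dbo.StageDaily
    (
        Id       INT           NOT NULL,
        LoadDate DATETIME2     NOT NULL,
        Payload  NVARCHAR(500) NULL
    )
    WITH (HEAP, DISTRIBUTION = ROUND_ROBIN);

    TRUNCATE TABLE dbo.StageDaily;  -- run before each daily load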




Question 6
From a website analytics system, you receive data extracts about user interactions such as downloads, link clicks, form submissions, and video plays.    
The data contains the following columns.
You need to design a star schema to support analytical queries of the data. The star schema will contain four tables including a date dimension.    
To which table should you add each column? To answer, select the appropriate options in the answer area.    
NOTE: Each correct selection is worth one point. 
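For orientation, the shape such a star schema usually takes: one fact table whose rows reference the dimension tables by key, plus the date dimension. The exhibit's actual column list is not reproduced in this demo, so every name below is illustrative only:

    -- Illustrative star schema; the real column assignments depend on the exhibit.
    CREATE TABLE dbo.DimDate  (DateKey  INT NOT NULL, CalendarDate DATE NOT NULL);
    CREATE TABLE dbo.DimUser  (UserKey  INT NOT NULL, UserName NVARCHAR(100));
    CREATE TABLE dbo.DimEvent (EventKey INT NOT NULL, EventType NVARCHAR(50));

    CREATE TABLE dbo.FactInteraction
    (
        DateKey  INT NOT NULL,          -- references DimDate
        UserKey  INT NOT NULL,          -- references DimUser
        EventKey INT NOT NULL,          -- references DimEvent
        InteractionCount INT NOT NULL   -- measure
    );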




Question 7
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.    
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.    
You have an Azure Storage account that contains 100 GB of files. The files contain rows of text and numerical values. 75% of the rows contain description data that has an average length of 1.1 MB.    
You plan to copy the data from the storage account to an enterprise data warehouse in Azure Synapse Analytics.    
You need to prepare the files to ensure that the data copies quickly.  
Solution: You copy the files to a table that has a columnstore index. 
Does this meet the goal?

  • A: Yes
  • B: No



Question 8
You are creating an Azure Data Factory data flow that will ingest data from a CSV file, cast columns to specified data types, and insert the data into a table in an Azure Synapse Analytics dedicated SQL pool. The CSV file contains three columns named username, comment, and date.
The data flow already contains the following:   
A source transformation.  
A Derived Column transformation to set the appropriate data types.
A sink transformation to land the data in the pool.    
You need to ensure that the data flow meets the following requirements:   
All valid rows must be written to the destination table.  
Truncation errors in the comment column must be avoided proactively.  
Any rows containing comment values that will cause truncation errors upon insert must be written to a file in blob storage.    
Which two actions should you perform? Each correct answer presents part of the solution.    
NOTE: Each correct selection is worth one point.

  • A: To the data flow, add a sink transformation to write the rows to a file in blob storage.
  • B: To the data flow, add a Conditional Split transformation to separate the rows that will cause truncation errors.
  • C: To the data flow, add a filter transformation to filter out rows that will cause truncation errors.
  • D: Add a select transformation to select only the rows that will cause truncation errors. 



Question 9
You have two Azure Data Factory instances named ADFdev and ADFprod. ADFdev connects to an Azure DevOps Git repository.    
You publish changes from the main branch of the Git repository to ADFdev.  
You need to deploy the artifacts from ADFdev to ADFprod.  
What should you do first?

  • A: From ADFdev, modify the Git configuration.
  • B: From ADFdev, create a linked service. 
  • C: From Azure DevOps, create a release pipeline.
  • D: From Azure DevOps, update the main branch.



Question 10
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.  
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.    
You are designing an Azure Stream Analytics solution that will analyze Twitter data.    
You need to count the tweets in each 10-second window. The solution must ensure that each tweet is counted only once.    
Solution: You use a tumbling window, and you set the window size to 10 seconds.   
Does this meet the goal?

  • A: Yes
  • B: No
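For reference, a tumbling window in the Stream Analytics query language looks like the sketch below. The input, output, and timestamp column names are placeholders; tumbling windows are fixed-size and non-overlapping, so each event falls into exactly one window:

    -- Hypothetical input/output names for a Stream Analytics job.
    SELECT System.Timestamp() AS WindowEnd, COUNT(*) AS TweetCount
    INTO TweetCountOutput
    FROM TwitterStream TIMESTAMP BY CreatedAt
    GROUP BY TumblingWindow(second, 10);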





