Data Factory Intermediate

Welcome to your Data Factory Intermediate Quiz.

1. Which activity in Azure Data Factory is used for executing SQL scripts against data stored in Azure SQL Database or Azure SQL Data Warehouse?
   a) Copy Activity
   b) Data Flow
   c) Lookup Activity
   d) Stored Procedure Activity

2. In Azure Data Factory, what does the Copy Activity do?
   a) Creates a copy of the entire data factory
   b) Copies data from one data store to another
   c) Creates a copy of an Azure virtual machine
   d) Copies data from on-premises servers to the cloud

3. What is the purpose of the Azure Data Factory Monitor & Manage tool?
   a) To design data pipelines
   b) To create linked services
   c) To monitor and manage data factory activities and pipelines
   d) To build data flows

4. In Azure Data Factory, what is the primary purpose of a Dataset?
   a) To define the schedule of data pipelines
   b) To store data
   c) To specify data schema and format
   d) To create virtual networks

5. What is the Azure Data Factory integration runtime used for?
   a) To create data pipelines
   b) To monitor and manage data factory activities
   c) To specify data schema
   d) To move data between on-premises and cloud data stores

6. Which Azure service is commonly used to store big data and is often integrated with Azure Data Factory for data storage?
   a) Azure Data Lake Storage
   b) Azure Cosmos DB
   c) Azure SQL Database
   d) Azure Blob Storage

7. In Azure Data Factory, what is the purpose of a linked service of type "Azure Blob Storage"?
   a) To create virtual networks
   b) To connect to on-premises data sources
   c) To connect to Azure Blob Storage accounts
   d) To design data pipelines

8. What is the primary function of the Azure Data Factory Copy Wizard?
   a) To create data pipelines
   b) To copy data between Azure Data Factories
   c) To design data flows
   d) To assist in configuring data copy activities

9. In Azure Data Factory, what is a Trigger used for?
   a) To trigger data analysis tasks
   b) To define the data schema
   c) To schedule and automate data pipeline executions
   d) To create linked services

10. What is the primary role of the Azure Data Factory Data Flow activity?
   a) To schedule data extraction tasks
   b) To build data transformation logic using data wrangling
   c) To create linked services
   d) To monitor data pipeline executions

Please fill out the form to view your results. A copy will also be sent to your email shortly after submission.