🧠 Beginner Level (1–10)
- What is Azure Data Factory used for?
A) Machine Learning
B) Big Data Storage
C) Data Integration and ETL
D) Website Hosting
- Which of the following is not a core component of ADF?
A) Pipeline
B) Dataset
C) Notebook
D) Linked Service
- What type of service is Azure Data Factory?
A) SaaS
B) IaaS
C) PaaS
D) FaaS
- Which ADF object defines the source or destination data structure?
A) Linked Service
B) Dataset
C) Pipeline
D) Trigger
- True or False: You can run Python scripts directly in Azure Data Factory.
False
- What does an ADF pipeline contain?
A) Tables
B) Activities
C) Servers
D) Repositories
- Which ADF activity is used to move data from one place to another?
A) Copy Activity
B) Execute Pipeline
C) Get Metadata
D) Lookup
- Which of these is a control activity in ADF?
A) Stored Procedure
B) Copy
C) If Condition
D) Data Flow
- What is the purpose of a Linked Service?
A) Run a VM
B) Define data format
C) Define connection details
D) Automate pipelines
- True or False: Datasets in ADF are always required to run a pipeline.
False
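The beginner questions above describe one object model: a pipeline contains activities, a linked service holds connection details, and a dataset describes the structure and location of the data. A minimal sketch of how these definitions nest, written as Python dictionaries mirroring the JSON shape ADF stores for these resources (all names and the connection string are hypothetical):

```python
# Hedged sketch of the core ADF object model as plain dictionaries.
# All names here are hypothetical; real definitions are authored in
# ADF Studio or deployed as JSON resources.

# A Linked Service defines *connection details* (not data shape).
linked_service = {
    "name": "MyBlobStorage",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {"connectionString": "<redacted>"},
    },
}

# A Dataset defines the *structure/location* of data and points at a linked service.
dataset = {
    "name": "BlobCustomersCsv",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "MyBlobStorage",
            "type": "LinkedServiceReference",
        },
    },
}

# A Pipeline contains *activities*; the Copy activity moves data from a
# source dataset to a sink dataset.
pipeline = {
    "name": "CopyCustomers",
    "properties": {
        "activities": [
            {
                "name": "CopyBlobToSql",
                "type": "Copy",
                "inputs": [{"referenceName": "BlobCustomersCsv", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "SqlCustomers", "type": "DatasetReference"}],
            }
        ]
    },
}
```

Reading the references bottom-up gives the answer pattern behind several questions above: Copy activity for movement, datasets for structure, linked services for connections.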
⚙️ Intermediate Level (11–20)
- Which activity would you use to perform transformations on large datasets visually?
A) Stored Procedure
B) Mapping Data Flow
C) Copy Activity
D) Lookup Activity
- What language is used in ADF to create dynamic expressions?
A) JavaScript
B) PowerShell
C) Data Flow Script
D) Data Factory Expression Language
- Triggers in ADF can be used to:
A) Send emails
B) Monitor costs
C) Schedule or respond to events
D) Scale VMs
- Which trigger type is used for event-based pipelines?
A) Tumbling Window
B) Event Grid
C) Schedule
D) Manual
- An ADF Integration Runtime (IR) is required to:
A) Schedule triggers
B) Connect to Git
C) Perform data movement and compute
D) Monitor pipelines
- Which Integration Runtime is best for accessing on-premises data?
A) Azure IR
B) Self-Hosted IR
C) Auto IR
D) SSIS IR
- Can ADF connect to both structured and unstructured data sources?
A) Yes
B) No
- What is the purpose of the Lookup activity?
A) Perform joins
B) Run SQL queries and return data
C) Update tables
D) Trigger notifications
- What is parameterization in ADF?
A) Encrypting data
B) Running SQL commands
C) Passing dynamic values into pipelines or datasets
D) Automating backups
- How does ADF handle retries when an activity fails?
A) No retries
B) Automatic infinite retries
C) Retries can be configured per activity
D) Only allowed in Data Flows
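Three of the intermediate answers (parameterization, the Data Factory Expression Language, and per-activity retries) appear together in a pipeline definition. A hedged sketch with hypothetical names; the `policy` block and `@pipeline().parameters.*` syntax follow the shapes ADF uses in its pipeline JSON:

```python
# Hedged sketch: a parameterized pipeline with a per-activity retry policy.
pipeline = {
    "name": "LoadDailyFile",  # hypothetical
    "properties": {
        # Parameterization: dynamic values passed into the pipeline at run time.
        "parameters": {"fileName": {"type": "string", "defaultValue": "sales.csv"}},
        "activities": [
            {
                "name": "LookupRowCount",
                "type": "Lookup",  # runs a query and returns its result to the pipeline
                # Retries are configured per activity, not globally.
                "policy": {"retry": 3, "retryIntervalInSeconds": 30},
                "typeProperties": {
                    "source": {
                        "type": "AzureSqlSource",
                        # Data Factory Expression Language: strings starting with
                        # '@' are evaluated at run time.
                        "sqlReaderQuery": "@concat('SELECT * FROM staging_', pipeline().parameters.fileName)",
                    }
                },
            }
        ],
    },
}
```

If the Lookup activity still fails after its three retries, downstream activities see the failure through the pipeline's dependency conditions.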
🧪 Advanced Level (21–25)
- Which ADF feature allows integration with Git repositories?
A) Data Flows
B) Integration Runtime
C) Git Configuration pane
D) Pipeline Trigger
- What is a debug run in ADF?
A) Production test
B) A test run that doesn’t publish to the live environment
C) A run that optimizes performance
D) A dry run that deletes data
- Which ADF feature is best for building CI/CD pipelines?
A) Integration Runtime
B) Azure DevOps or GitHub Actions
C) Data Flow Debug
D) Triggers
- True or False: You can use parameters and expressions in Mapping Data Flows.
True
- What’s the best way to monitor pipeline executions in ADF?
A) Azure Portal Monitoring tab
B) SQL Profiler
C) Excel
D) Kusto Query Language (KQL)
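Circling back to the trigger questions in the intermediate section: a trigger is itself a small resource that references the pipeline it starts. A hedged sketch of a schedule trigger with hypothetical names (the event-based variant uses a different trigger type, backed by Event Grid storage events):

```python
# Hedged sketch of a schedule trigger: fires on a recurrence and
# starts the referenced pipeline.
schedule_trigger = {
    "name": "DailyAt6",  # hypothetical
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",
                "interval": 1,
                "startTime": "2024-01-01T06:00:00Z",
            }
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "MyPipeline",  # hypothetical pipeline name
                    "type": "PipelineReference",
                }
            }
        ],
    },
}
```

Until a trigger like this is attached and started, a pipeline only runs manually or via a debug run.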