🧠 Beginner Level (1–10)

  1. What is Azure Data Factory used for?
    A) Machine Learning
    B) Big Data Storage
    C) Data Integration and ETL
    D) Website Hosting
    Answer: C) Data Integration and ETL
  2. Which of the following is not a core component of ADF?
    A) Pipeline
    B) Dataset
    C) Notebook
    D) Linked Service
    Answer: C) Notebook
  3. What type of service is Azure Data Factory?
    A) SaaS
    B) IaaS
    C) PaaS
    D) FaaS
    Answer: C) PaaS
  4. Which ADF object defines the source or destination data structure?
    A) Linked Service
    B) Dataset
    C) Pipeline
    D) Trigger
    Answer: B) Dataset
  5. True or False: You can run Python scripts directly in Azure Data Factory.
    Answer: False (ADF orchestrates external compute such as Azure Databricks or Azure Batch to run Python; it does not execute scripts natively)
  6. What does an ADF pipeline contain?
    A) Tables
    B) Activities
    C) Servers
    D) Repositories
    Answer: B) Activities
  7. Which ADF activity is used to move data from one place to another?
    A) Copy Activity
    B) Execute Pipeline
    C) Get Metadata
    D) Lookup
    Answer: A) Copy Activity
  8. Which of these is a control activity in ADF?
    A) Stored Procedure
    B) Copy
    C) If Condition
    D) Data Flow
    Answer: C) If Condition
  9. What is the purpose of a Linked Service?
    A) Run a VM
    B) Define data format
    C) Define connection details
    D) Automate pipelines
    Answer: C) Define connection details
  10. True or False: Datasets in ADF are always required to run a pipeline.
    Answer: False (control activities such as Web, Wait, and Set Variable run without any dataset)
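
To ground questions 4, 9, and 10: a Linked Service carries connection details, while a Dataset describes the shape and location of data reached through that Linked Service. A minimal sketch of a delimited-text Dataset definition (the names `SalesCsv` and `BlobStorageLS`, the container, and the file are all hypothetical):

```json
{
  "name": "SalesCsv",
  "properties": {
    "type": "DelimitedText",
    "linkedServiceName": {
      "referenceName": "BlobStorageLS",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "container": "raw",
        "fileName": "sales.csv"
      },
      "columnDelimiter": ",",
      "firstRowAsHeader": true
    }
  }
}
```

The referenced `BlobStorageLS` Linked Service would hold the storage connection string; the Dataset itself never stores credentials, only structure and location.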

⚙️ Intermediate Level (11–20)

  11. Which activity would you use to perform transformations on large datasets visually?
    A) Stored Procedure
    B) Mapping Data Flow
    C) Copy Activity
    D) Lookup Activity
    Answer: B) Mapping Data Flow
  12. What language is used in ADF to create dynamic expressions?
    A) JavaScript
    B) PowerShell
    C) Data Flow Script
    D) Data Factory Expression Language
    Answer: D) Data Factory Expression Language
  13. Triggers in ADF can be used to:
    A) Send emails
    B) Monitor costs
    C) Schedule or respond to events
    D) Scale VMs
    Answer: C) Schedule or respond to events
  14. Which trigger type is used for event-based pipelines?
    A) Tumbling Window
    B) Event Grid
    C) Schedule
    D) Manual
    Answer: B) Event Grid (ADF's storage and custom event triggers are built on Azure Event Grid)
  15. An ADF Integration Runtime (IR) is required to:
    A) Schedule triggers
    B) Connect to Git
    C) Perform data movement and compute
    D) Monitor pipelines
    Answer: C) Perform data movement and compute
  16. Which Integration Runtime is best for accessing on-premises data?
    A) Azure IR
    B) Self-Hosted IR
    C) Auto IR
    D) SSIS IR
    Answer: B) Self-Hosted IR
  17. Can ADF connect to both structured and unstructured data sources?
    A) Yes
    B) No
    Answer: A) Yes
  18. What is the purpose of the Lookup activity?
    A) Perform joins
    B) Run SQL queries and return data
    C) Update tables
    D) Trigger notifications
    Answer: B) Run SQL queries and return data
  19. What is parameterization in ADF?
    A) Encrypting data
    B) Running SQL commands
    C) Passing dynamic values into pipelines or datasets
    D) Automating backups
    Answer: C) Passing dynamic values into pipelines or datasets
  20. How does ADF handle retries when an activity fails?
    A) No retries
    B) Automatic infinite retries
    C) Retries can be configured per activity
    D) Only allowed in Data Flows
    Answer: C) Retries can be configured per activity
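
Questions 12, 19, and 20 come together in pipeline JSON: parameters are declared on the pipeline, referenced with `@`-prefixed expressions, and each activity carries its own retry policy. A hedged sketch, not a production definition (the pipeline, dataset, and parameter names are made up, and it assumes the input dataset declares a `fileName` parameter):

```json
{
  "name": "CopySalesData",
  "properties": {
    "parameters": {
      "runDate": { "type": "string", "defaultValue": "2024-01-01" }
    },
    "activities": [
      {
        "name": "CopyDailyFile",
        "type": "Copy",
        "policy": {
          "timeout": "0.01:00:00",
          "retry": 3,
          "retryIntervalInSeconds": 60
        },
        "inputs": [
          {
            "referenceName": "SalesCsv",
            "type": "DatasetReference",
            "parameters": {
              "fileName": {
                "value": "@concat('sales_', pipeline().parameters.runDate, '.csv')",
                "type": "Expression"
              }
            }
          }
        ],
        "outputs": [
          { "referenceName": "SalesParquet", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "ParquetSink" }
        }
      }
    ]
  }
}
```

The `retry`/`retryIntervalInSeconds` pair is what question 20 refers to: retries are configured per activity and default to 0.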

🧪 Advanced Level (21–25)

  21. Which ADF feature allows integration with Git repositories?
    A) Data Flows
    B) Integration Runtime
    C) Git Configuration pane
    D) Pipeline Trigger
    Answer: C) Git Configuration pane
  22. What is a debug run in ADF?
    A) Production test
    B) A test run that doesn’t publish to the live environment
    C) A run that optimizes performance
    D) A dry run that deletes data
    Answer: B) A test run that doesn’t publish to the live environment
  23. Which tooling is typically used to build CI/CD pipelines for ADF?
    A) Integration Runtime
    B) Azure DevOps or GitHub Actions
    C) Data Flow Debug
    D) Triggers
    Answer: B) Azure DevOps or GitHub Actions
  24. True or False: You can use parameters and expressions in Mapping Data Flows.
    Answer: True
  25. What’s the best way to monitor pipeline executions in ADF?
    A) Azure Portal Monitoring tab
    B) SQL Profiler
    C) Excel
    D) Kusto Query Language (KQL)
    Answer: A) Azure Portal Monitoring tab