
Azure Data Factory Online Training | ADF Training Course Details

What is Azure Data Factory?

Azure Data Factory orchestrates the movement and transformation of data between various data stores and compute resources. You can create and schedule data-driven workflows (called pipelines) that can ingest data from disparate data stores. You can build complex ETL processes that transform data visually with data flows or by using compute services such as Azure HDInsight, Azure Databricks, Azure Synapse Analytics, and Azure SQL Database.

The purpose of Data Factory is to retrieve data from one or more data sources and convert it into a format that you can process. The data sources might present data in different ways and contain noise that you need to filter out. The interesting data might not be in a suitable format for processing by the other services in your data warehouse solution, so you can transform it.

Azure Data Factory Key Components

Azure Data Factory consists of a number of components that, together, allow you to build data copy, ingestion, and transformation workflows.

Create pipelines to execute one or more activities. If an activity moves or transforms data, define the input and output format in datasets. Then, connect to the data sources or services through linked services. After you have created a pipeline, you can add triggers to automatically execute it at specific times or based on events.
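The sketch below shows how these components relate to one another in code. It uses the azure-mgmt-datafactory Python SDK (not part of the course material) to define a linked service, two datasets, and a pipeline with a single copy activity; the resource group, factory name, storage connection string, and all resource names are illustrative placeholders, and the exact model classes may vary slightly between SDK versions.

```python
# Minimal sketch of the key Data Factory components using the
# azure-mgmt-datafactory Python SDK. Names and credentials are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureBlobDataset, AzureStorageLinkedService, BlobSink, BlobSource,
    CopyActivity, DatasetReference, DatasetResource,
    LinkedServiceReference, LinkedServiceResource, PipelineResource, SecureString,
)

subscription_id = "<subscription-id>"   # placeholder
resource_group = "my-rg"                # illustrative resource group
factory_name = "my-adf"                 # illustrative data factory

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Linked service: the connection to the data store (here, Azure Blob Storage).
storage_ls = LinkedServiceResource(
    properties=AzureStorageLinkedService(
        connection_string=SecureString(value="<storage-connection-string>")
    )
)
client.linked_services.create_or_update(
    resource_group, factory_name, "StorageLinkedService", storage_ls
)

# Datasets: the input and output data shapes, bound to the linked service.
ls_ref = LinkedServiceReference(
    type="LinkedServiceReference", reference_name="StorageLinkedService"
)
input_ds = DatasetResource(
    properties=AzureBlobDataset(
        linked_service_name=ls_ref, folder_path="input", file_name="data.csv"
    )
)
output_ds = DatasetResource(
    properties=AzureBlobDataset(linked_service_name=ls_ref, folder_path="output")
)
client.datasets.create_or_update(resource_group, factory_name, "InputDataset", input_ds)
client.datasets.create_or_update(resource_group, factory_name, "OutputDataset", output_ds)

# Pipeline: one copy activity that reads the input dataset and writes the output dataset.
copy = CopyActivity(
    name="CopyBlobToBlob",
    inputs=[DatasetReference(type="DatasetReference", reference_name="InputDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="OutputDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)
pipeline = PipelineResource(activities=[copy], parameters={})
client.pipelines.create_or_update(resource_group, factory_name, "CopyPipeline", pipeline)
```

Once the pipeline exists, you would typically attach a schedule or event trigger to it (the same SDK exposes a `triggers` operations group for this) so that it runs automatically rather than only on demand.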

Azure Data Factory (ADF) Online Training Course Content

  • Module 01 - Non-Relational Data Stores and Azure Data Lake Storage
  • Module 02 - Data Lake and Azure Cosmos DB
  • Module 03 - Relational Data Stores
  • Module 04 - Why Azure SQL?
  • Module 05 - Azure Batch
  • Module 06 - Azure Data Factory
  • Module 07 - Azure Databricks
  • Module 08 - Azure Stream Analytics
  • Module 09 - Monitoring & Security

Request a Free Demo

Start Your Best Online Classes With Us