
What is Pipeline in Azure Data Factory

In this article, we will discuss pipelines in Azure Data Factory.

Pipeline in English:

 

In Azure Data Factory, a pipeline is a logical grouping of activities that together perform a specific task. It represents a workflow that moves and transforms data from one place to another.
 
A pipeline consists of one or more activities that can be executed sequentially or in parallel. Activities can include data movement, data transformation, and control flow activities.
 
Data movement activities are used to read data from a source and write it to a destination. Data transformation activities are used to modify the structure or content of the data. Control flow activities are used to define the order in which activities are executed, add conditional logic, or set up iterations.
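As an illustration, these activity categories appear as different `type` values in a pipeline's JSON definition. The sketch below shows a data movement activity followed by a control activity; the pipeline, activity, and dataset names are invented for this example, not taken from a real factory:

```json
{
  "name": "CopyAndProcessPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyFromBlobToSql",
        "type": "Copy",
        "inputs": [ { "referenceName": "SourceBlobDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "SinkSqlDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "SqlSink" }
        }
      },
      {
        "name": "WaitBeforeNextStep",
        "type": "Wait",
        "dependsOn": [
          { "activity": "CopyFromBlobToSql", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": { "waitTimeInSeconds": 30 }
      }
    ]
  }
}
```

Here `Copy` is a data movement activity, and the `dependsOn` block is how sequential execution order is expressed; activities with no dependency on each other run in parallel. Transformation activities such as a Databricks notebook follow the same pattern with their own `type` and `typeProperties`.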
 
Pipelines can be scheduled to run at specific times or triggered by an event. They can also be parameterized to allow for flexibility in the data processing flow.
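For instance, a schedule trigger definition attaches one or more pipelines and can pass parameter values to them at run time. A hypothetical sketch (the trigger name, pipeline name, and `sourceFolder` parameter are made up for illustration):

```json
{
  "name": "DailyTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2024-01-01T02:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      {
        "pipelineReference": { "referenceName": "CopyAndProcessPipeline", "type": "PipelineReference" },
        "parameters": { "sourceFolder": "daily-exports" }
      }
    ]
  }
}
```

Besides schedule triggers, Azure Data Factory also supports tumbling window triggers and event-based triggers (for example, firing when a blob is created).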
 
Pipelines can be designed using Azure Data Factory's graphical user interface or JSON code. They can be monitored, managed, and triggered using the Azure Data Factory portal or REST API.
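The REST path for starting a pipeline run follows a fixed pattern. As a small sketch, the `createRun` URL can be built as below; the subscription, resource group, factory, and pipeline names are placeholders, and the actual POST request also needs an Azure AD bearer token, which is omitted here:

```python
import json

def create_run_url(subscription_id, resource_group, factory_name, pipeline_name):
    """Return the management endpoint that starts an ADF pipeline run
    (REST API version 2018-06-01)."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.DataFactory"
        f"/factories/{factory_name}"
        f"/pipelines/{pipeline_name}/createRun"
        "?api-version=2018-06-01"
    )

# Placeholder values, for illustration only.
url = create_run_url("0000-sub", "my-rg", "my-factory", "CopyAndProcessPipeline")

# The POST body supplies run-time values for the pipeline's parameters.
body = json.dumps({"sourceFolder": "daily-exports"})
```

The same run can be started from the portal's "Trigger now" button; the REST route is useful when orchestrating runs from external systems.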
 
Overall, pipelines are a key component of Azure Data Factory that enable users to orchestrate complex data workflows and automate data processing tasks at scale.
 
 
Pipeline in Hindi:
 
In Azure Data Factory (ADF), a pipeline is a fundamental unit used for data integration and data transformation. A pipeline is a sequence of one or more activities that move, transform, and process data.
 
Using an ADF pipeline, you can extract data from source systems and load it into destination systems, transform and map data, and perform batch processing. Activities in a pipeline execute one after another, and some activities can also run in parallel.
 
An ADF pipeline supports several types of activities: Data Movement, Data Transformation, Control, and Integration. Data Movement activities such as the Copy activity and the Data Flow activity move data; Data Transformation activities such as the Databricks activity and the HDInsight activity transform data; Control activities such as the If Condition and the ForEach loop control pipeline execution; and Integration activities such as the Web activity and the Lookup activity bring in data from external systems.
 
These activities are configured inside the pipeline, which is then triggered so that it is ready to run. Using ADF pipelines, you can build a scalable, reliable, and secure data integration solution.
 
 
 

 

Thank You

About Author

Brijesh Kumar

Database Developer

I have more than 6 years of experience in Microsoft technologies - SQL Server Database, ETL, Azure Cloud - Azure SQL Database, Cosmos DB, Azure Data Factory, Power BI, WebJobs, Azure Functions, Azure Storage, Web Apps, PowerShell, and database migration from on-premises to Azure Cloud.
LinkedIn : https://www.linkedin.com


