Azure Data Factory Pipeline Manual Execution (PowerShell)

We looked at the relationship between Azure Data Factory activities and pipelines in the previous post. After creating a pipeline, we need to execute, or run, it to begin the task the pipeline is designed to perform. A pipeline run is an instance of a pipeline execution.

The simplest way to run a pipeline is manual, or on-demand, execution. There are several ways to do this. Here we will look at the PowerShell command to execute an Azure Data Factory pipeline programmatically using the new Az PowerShell module:

Invoke-AzDataFactoryV2Pipeline -DataFactory $df -PipelineName "AsdfPipeline" -ParameterFile .\PipelineParameters.json

Here "AsdfPipeline" is the name of the pipeline being run, and the -ParameterFile parameter specifies the path of a JSON file containing the pipeline parameter values; in this example, the source and sink blob containers.
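Putting the pieces together, a minimal end-to-end sketch might look like the following. The resource group and factory names ("ADFTutorialResourceGroup", "ADFTutorialDataFactory") are placeholders; substitute your own. Invoke-AzDataFactoryV2Pipeline returns the run ID, which can then be passed to Get-AzDataFactoryV2PipelineRun to check the status of the run:

```powershell
# Sign in to Azure (interactive)
Connect-AzAccount

# Placeholder names: replace with your resource group and data factory
$rg = "ADFTutorialResourceGroup"
$df = Get-AzDataFactoryV2 -ResourceGroupName $rg -Name "ADFTutorialDataFactory"

# Trigger the pipeline run; the cmdlet returns the run ID
$runId = Invoke-AzDataFactoryV2Pipeline -DataFactory $df `
    -PipelineName "AsdfPipeline" `
    -ParameterFile .\PipelineParameters.json

# Check the status of the run (e.g. Queued, InProgress, Succeeded, Failed)
Get-AzDataFactoryV2PipelineRun -ResourceGroupName $rg `
    -DataFactoryName $df.DataFactoryName `
    -PipelineRunId $runId
```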

The format of the JSON file to be passed as a parameter to the above PowerShell command is shown below:

{
  "sourceBlobContainer": "MySourceFolder",
  "sinkBlobContainer": "MySinkFolder"
}
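As an alternative to -ParameterFile, the same values can be supplied inline as a hashtable via the -Parameter parameter, which is convenient when the values are computed in the script itself. This sketch assumes the same $df factory object and pipeline name as above:

```powershell
# Pass the pipeline parameters inline instead of from a JSON file
$params = @{
    sourceBlobContainer = "MySourceFolder"
    sinkBlobContainer   = "MySinkFolder"
}

Invoke-AzDataFactoryV2Pipeline -DataFactory $df `
    -PipelineName "AsdfPipeline" `
    -Parameter $params
```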

Please note that Azure Data Factory pipelines can also be triggered programmatically using the .NET or Python SDKs, or the REST API. It is also possible to run a pipeline manually from the Azure portal. In addition, there are multiple ways to automate pipeline execution. We will look at these options in a future post.
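For reference, the REST API option mentioned above triggers a run by POSTing to the pipeline's createRun endpoint. The subscription ID, resource group, factory name, and bearer token below are placeholders, and the api-version shown is the one commonly documented for Data Factory V2:

```shell
# Placeholders: substitute your subscription ID, resource group,
# factory name, and a valid Azure AD bearer token.
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"sourceBlobContainer": "MySourceFolder", "sinkBlobContainer": "MySinkFolder"}' \
  "https://management.azure.com/subscriptions/<subId>/resourceGroups/<rg>/providers/Microsoft.DataFactory/factories/<factory>/pipelines/AsdfPipeline/createRun?api-version=2018-06-01"
```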

Reference: https://docs.microsoft.com/en-us/azure/data-factory/concepts-pipeline-execution-triggers
