I have an Azure Synapse pipeline which is triggered by a storage event. In the pipeline trigger we have defined two values: Container name, whose value is flatData, and Blob path begins with, whose value is inputFiles/.
What it should do: when files (e.g. market_orientation.csv) are received under flatData/inputFiles/, it should read them and move them to another location. Right now, once a file is received under flatData/inputFiles/ the pipeline gets triggered, but while reading the file it tries to read it from flatData rather than flatData/inputFiles/. Somehow it is ignoring Blob path begins with while reading the file.
Comment from Celador: It sounds like your trigger is working, but something in your pipeline setup may be wrong. However, you've provided no screenshots or details of how your pipeline is set up. Usually with a blob creation trigger you add two pipeline parameters, Folder and File, and the trigger passes @triggerBody().folderPath and @triggerBody().fileName to those parameters. You then use those parameters in any pipeline activities. Are you doing that?
1 Answer
It seems like you have opted for the wildcard file path in your copy activity, which might be the reason it is copying all the files in the given container.
The trigger only fires the pipeline when the uploaded/modified file matches the given filters; which file actually gets read and copied depends on the activity configuration.
To read and copy the same file that fired the trigger, you need to use the trigger parameters @triggerBody().folderPath and @triggerBody().fileName in your pipeline.
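As a minimal sketch, the pipeline would expose two string parameters to receive these values. The pipeline name MoveInputFilesPipeline and the parameter names folderpath/filename below are just illustrative; the definition is trimmed to the relevant part.

```json
{
    "name": "MoveInputFilesPipeline",
    "properties": {
        "parameters": {
            "folderpath": { "type": "string" },
            "filename": { "type": "string" }
        },
        "activities": []
    }
}
```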
First, create two dataset parameters folderpath and filename of string type in your dataset and use them in the container name and file name fields as shown below.
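As a rough illustration, assuming a DelimitedText dataset on an Azure Blob Storage linked service named AzureBlobStorageLS (a placeholder), the dataset JSON would look something like this, with the parameters plugged into the location through @dataset(). Note that @triggerBody().folderPath already includes the container name (e.g. flatData/inputFiles), which is why it is mapped into the container/path field here; depending on your dataset type you may prefer to split it into separate container and directory values.

```json
{
    "name": "SourceCsvDataset",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "AzureBlobStorageLS",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "folderpath": { "type": "string" },
            "filename": { "type": "string" }
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": {
                    "value": "@dataset().folderpath",
                    "type": "Expression"
                },
                "fileName": {
                    "value": "@dataset().filename",
                    "type": "Expression"
                }
            },
            "columnDelimiter": ",",
            "firstRowAsHeader": true
        }
    }
}
```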
Similarly, create the same parameters in the pipeline, without any default values. When creating the trigger, it will ask whether you would like to provide values for these parameters; here, pass the trigger parameters to the pipeline parameters.
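In trigger JSON terms, that mapping ends up looking roughly like the following sketch (trigger and pipeline names are placeholders, and the definition is trimmed to the relevant parts):

```json
{
    "name": "InputFilesBlobTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "blobPathBeginsWith": "/flatData/blobs/inputFiles/",
            "events": [ "Microsoft.Storage.BlobCreated" ]
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "MoveInputFilesPipeline",
                    "type": "PipelineReference"
                },
                "parameters": {
                    "folderpath": "@triggerBody().folderPath",
                    "filename": "@triggerBody().fileName"
                }
            }
        ]
    }
}
```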
Now, in the activity, pass the pipeline parameters to the dataset parameters. As a sample, I have used a Lookup activity here; you can do the same with a Copy activity or a data flow as well, as sketched below.
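A sketch of that wiring for a Lookup activity (the activity name and the minimal DelimitedTextSource settings are illustrative; a Copy activity source or sink would reference the parameterized dataset in the same way):

```json
{
    "name": "LookupTriggeredFile",
    "type": "Lookup",
    "typeProperties": {
        "source": { "type": "DelimitedTextSource" },
        "dataset": {
            "referenceName": "SourceCsvDataset",
            "type": "DatasetReference",
            "parameters": {
                "folderpath": {
                    "value": "@pipeline().parameters.folderpath",
                    "type": "Expression"
                },
                "filename": {
                    "value": "@pipeline().parameters.filename",
                    "type": "Expression"
                }
            }
        },
        "firstRowOnly": false
    }
}
```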
Upon upload or modification of a file, these parameters receive the folder path and file name of that file, and those values are propagated to the dataset and the activity. You can confirm this from the pipeline run and the activity input details.