azure data factory - Best way to pass rows from SQL table (more than 5000 rows) in a web activity body (ADF) - Stack Overflow

I am using an ADF Lookup activity (which returns thousands of rows per month). I need to be able to pass the output of the Lookup activity in batches to a Web activity, as a JSON-format expression, passing month as a parameter.

@json(concat('{"Data":',string(activity('Lookup1').output.value),'}')) 

How do I accomplish this, given that the Lookup activity has a limitation of reading only 5000 rows?

Is there a way to write all of the thousands of rows to a CSV file and read it back from the data lake into the Web activity?

Or loop through as shown in this YouTube video: https://www.youtube.com/watch?v=Y4yFXGbhYZI. Which is the better option?

Ideally, I would like to read all the rows back in one shot per month and pass them to the body of the Web activity. If that is not an option, then I will have to use a loop.

Please advise which is the better option, or whether you have done this in the past.

asked Mar 15 at 7:52 by 2023hpy
  • Can you provide your sample input and expected output? – Rakesh Govindula Commented Mar 15 at 13:04
  • "Data":[ { "VERSION": "current", "DATE": "202412", "AMT" :"11.22" }, { "VERSION": "current", "DATE": "202411", "AMT":"12.11" }, and so on and so forth for 50,000 rows. Data is retrieved in json format shown above from sql table and the same is passed to the body of the web activity. – 2023hpy Commented Mar 16 at 2:26

1 Answer

Is there a way to read all the thousands of rows put in a CSV file, and read it back from the data lake to the web activity? Or loop through as shown in this YouTube video. Which is a good option?

Here is another alternative, which might be less complex than the approach shown in the YouTube video:

To pass rows from a SQL table (more than 5000 rows) in a Web activity body (ADF) using an Until loop:

First, define three variables:

  1. Iterations - the total number of iterations needed, determined from a Lookup activity
  2. Index - used to track progress
  3. Count - used to calculate the pagination offset

Then use a Lookup activity to get the total number of rows of the table to process (in your case it will be greater than 5000).
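
For example, the total row count can come from a query like this (a minimal sketch, reusing the sample_data1 table from the paging query below; substitute your own table):

SELECT COUNT(*) AS total_rows FROM sample_data1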

Then use a Set Variable activity to calculate the number of iterations needed to process the total number of rows, by dividing the total row count by the batch size. Here I divided by 30; in your case the batch size will be 5000.
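
A sketch of that Set Variable expression, assuming the count Lookup is named LookupCount (a hypothetical name) and returns the total_rows column with "First row only" enabled; adding the batch size minus 1 before the integer division rounds up when the row count is not an exact multiple of 5000:

@string(div(add(int(activity('LookupCount').output.firstRow.total_rows), 4999), 5000))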

Now use an Until loop with the following condition: @equals(variables('count'), variables('iterations'))

  • This ensures the loop runs until count reaches iterations.

Now, under the Until loop, use a Lookup activity to fetch one batch of rows per iteration, with the count variable used as the offset in the query. (Here I fetched 30 rows per batch; in your case it will be 5000.)

@concat('select * from sample_data1 ORDER BY id ASC OFFSET ',string(mul(30,int(variables('count')))),' ','ROWS FETCH NEXT 30 ROWS ONLY')
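
Each fetched batch can then be posted by a Web activity inside the same loop, reusing the body expression from the question against the inner Lookup (assuming that inner Lookup is named Lookup2):

@json(concat('{"Data":', string(activity('Lookup2').output.value), '}'))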

After processing each batch, increment the index variable: @add(int(variables('count')), 1)

Then update count to match index so that the next batch starts at the correct offset. ADF does not allow a Set Variable activity to reference the variable it is assigning, which is why the increment goes through index instead of updating count directly.
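
That second Set Variable activity is then simply (assuming the variable names defined above):

@variables('index')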
