
Parallel OData API calls using python asyncio to extract large data from SAP - Stack Overflow


I am trying to develop code that will call SAP OData API endpoints multiple times in parallel and fetch the data as JSON. Let's assume that we want to send 5 concurrent requests at a time. In each request, the OData query parameter values have to change dynamically.

Below is the OData API URL:

url = f"https://<server_ip>:<port>/sap/opu/odata/sap/ZRSO_BKPF?$format=json&$top={top_value}&$skip={skip_value}"

It is clear that top_value and skip_value will change with each API call. So the URLs for the first batch of parallel calls (5 API endpoints with different top_value and skip_value) would look like this:

url_1 = "https://<server_ip>:<port>/sap/opu/odata/sap/ZRSO_BKPF?$format=json&$top=1000&$skip=0"  # for the 1st call the skip value is 0
url_2 = "https://<server_ip>:<port>/sap/opu/odata/sap/ZRSO_BKPF?$format=json&$top=1001&$skip=2000"
url_3 = "https://<server_ip>:<port>/sap/opu/odata/sap/ZRSO_BKPF?$format=json&$top=2001&$skip=3000"
url_4 = "https://<server_ip>:<port>/sap/opu/odata/sap/ZRSO_BKPF?$format=json&$top=3001&$skip=4000"
url_5 = "https://<server_ip>:<port>/sap/opu/odata/sap/ZRSO_BKPF?$format=json&$top=4001&$skip=5000"
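
In other words, the intent is simply to request consecutive pages of a fixed chunk size (the exact values above are only indicative). One batch of such URLs could be built roughly like this, assuming a chunk size of 1000:

base_url = "https://<server_ip>:<port>/sap/opu/odata/sap/ZRSO_BKPF?$format=json"
chunk_size = 1000

# One batch of 5 URLs, each covering the next chunk_size records
urls = [f"{base_url}&$top={chunk_size}&$skip={i * chunk_size}" for i in range(5)]
# urls[0] ends with $top=1000&$skip=0, urls[1] with $top=1000&$skip=1000, and so on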

I have developed code, but it does this serially using a for loop.

import requests
from requests.auth import HTTPBasicAuth
import json
import os

num_iter = 5
dataChunkSize = 10000
fldr_to_write = '/local folder/on the drive'
# url, headers, usr, pwd and flt are defined earlier in the script
for i in range(1, dataChunkSize * num_iter, dataChunkSize):
  if i == 1:
    # First chunk: skip value is 0
    data = requests.get(url=url + "&$top={0}".format(dataChunkSize) + "&$skip=0",
                        headers=headers, auth=HTTPBasicAuth(usr, pwd))
    if data.status_code == 200:
      data_f = json.loads(data.text)
      with open(os.path.join(fldr_to_write, 'bkpf_1st.json'), 'w', encoding='utf-8') as j:
        json.dump(data_f, j, ensure_ascii=False, indent=4)
  else:
    # Subsequent chunks: _make_http_call_to_sap is a helper that wraps requests.get
    data = _make_http_call_to_sap(
      url=url + "&$filter={0}&$top={1}&$skip={2}".format(flt, i, i + 999),
      headers=headers, auth=HTTPBasicAuth(usr, pwd))
    if data.status_code == 200:
      data_f = json.loads(data.text)
      with open(os.path.join(fldr_to_write, 'bkpf_{}.json'.format(i)), 'w', encoding='utf-8') as j:
        json.dump(data_f, j, ensure_ascii=False, indent=4)

I want to use asyncio and aiohttp to replicate the above logic, and I need some guidance on how to structure it.
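
For reference, this is roughly the structure I have in mind, but I am not sure it is correct (an untested sketch; base_url, usr, pwd, the chunk size and the output folder are placeholders taken from the serial version, and the semaphore is meant to cap concurrency at 5):

import asyncio
import json
import os

import aiohttp

base_url = "https://<server_ip>:<port>/sap/opu/odata/sap/ZRSO_BKPF?$format=json"
fldr_to_write = '/local folder/on the drive'
dataChunkSize = 10000
num_iter = 5

async def fetch_chunk(session, sem, top, skip):
    # The semaphore limits how many requests run at the same time
    async with sem:
        url = f"{base_url}&$top={top}&$skip={skip}"
        async with session.get(url) as resp:
            if resp.status == 200:
                data_f = await resp.json()
                out_path = os.path.join(fldr_to_write, 'bkpf_{}.json'.format(skip))
                with open(out_path, 'w', encoding='utf-8') as j:
                    json.dump(data_f, j, ensure_ascii=False, indent=4)

async def main():
    sem = asyncio.Semaphore(5)           # at most 5 concurrent requests
    auth = aiohttp.BasicAuth(usr, pwd)   # same credentials as in the requests version
    async with aiohttp.ClientSession(auth=auth) as session:
        tasks = [
            fetch_chunk(session, sem, dataChunkSize, i * dataChunkSize)
            for i in range(num_iter)
        ]
        await asyncio.gather(*tasks)

asyncio.run(main())

In particular, I am not sure whether writing the files synchronously inside the coroutine is acceptable, or whether the $top/$skip arithmetic above matches what my serial version does.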
