SRatna
Contributor

tRestClient | Sending data to tRestClient in BULK/batches?

Hi All,

We have a scenario where we need to send records to a web service (tRestClient) in bulk/batches.

Current Issue:

I have created a job (web service) that exports data from a PostgreSQL database and inserts it into a Cache Service, but the issue is the number of API calls.

Suppose I have 10 records in the PostgreSQL database; all of them are transferred to the Cache Service, so the job makes 10 API calls to the service.

 

In other words, the number of API calls equals the number of records.

 

So I want to know whether there is any setting or variable through which I can reduce these API calls. Due to a service limitation we can't send more than 35,000 API calls at a time, and we have a huge amount of data to migrate. Is there a way to send all the data to the service in a single API call?

 

Or is there a solution for sending the data in batches?
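For illustration, one common way to cut the call count, assuming the Cache Service can accept a JSON array of records in a single request, is to buffer rows and POST them in groups of N. A minimal plain-Java sketch of that idea (the endpoint URL, batch size, and record shape below are placeholders, not details of the actual job):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.ArrayList;
import java.util.List;

public class BatchedRestSender {
    // Placeholder endpoint and batch size; adjust to the real Cache Service and its limits.
    private static final String ENDPOINT = "https://example.com/cache-service/records";
    private static final int BATCH_SIZE = 500;

    private final HttpClient client = HttpClient.newHttpClient();
    private final List<String> buffer = new ArrayList<>();

    // Called once per record coming out of the database flow.
    public void add(String recordJson) throws Exception {
        buffer.add(recordJson);
        if (buffer.size() >= BATCH_SIZE) {
            flush();
        }
    }

    // Sends all buffered records as one JSON array in a single API call.
    public void flush() throws Exception {
        if (buffer.isEmpty()) {
            return;
        }
        String payload = "[" + String.join(",", buffer) + "]";
        HttpRequest request = HttpRequest.newBuilder(URI.create(ENDPOINT))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Sent " + buffer.size() + " records, HTTP " + response.statusCode());
        buffer.clear();
    }
}
```

With 10,000 records and a batch size of 500 this would make 20 calls instead of 10,000; the same buffer-and-flush pattern could be built inside the Talend job (for example with tJavaRow/tJavaFlex in front of tRestClient), but whether the target service accepts an array payload has to be confirmed first.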

 

2 Replies
gjeremy1617088143

Hi, if you can use a SQL query in tGreenplumInput, you can do this:

First, run a count of the rows returned by your SELECT, divide it by your fetch size and add 1 to get the number of executions, and store that number in the globalMap.

Then use a tRowGenerator with the number of generated rows equal to that variable, and a tFlowToIterate to iterate once per fetch of 2,000 rows.

Then you can use a SQL fetch request (or a BETWEEN clause if FETCH is not supported) in another tGreenplumInput, enter the number of rows to fetch (here 2,000), and add that number to the offset each time you iterate.

For example, with 5,500 rows it will call the DB 4 times: once for the row count and 3 times for the fetch.

The fetch query will execute 3 times, returning 2,000 rows, 2,000 rows, and 1,500 rows.
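To make the arithmetic concrete, here is a small stand-alone Java sketch of the count/offset logic described above (the row count, table name and fetch size are illustrative; in the job itself the count would come from the first tGreenplumInput and be stored in the globalMap). Note the ceiling division, which avoids an extra empty iteration when the count is an exact multiple of the fetch size:

```java
public class FetchPlan {
    public static void main(String[] args) {
        long totalRows = 5500;  // illustrative result of the initial SELECT COUNT(*)
        int fetchSize = 2000;   // rows fetched per iteration

        // Ceiling division: number of fetch iterations needed to cover every row.
        long iterations = (totalRows + fetchSize - 1) / fetchSize;  // 3 for 5,500 rows

        for (long i = 0; i < iterations; i++) {
            long offset = i * fetchSize;
            // Query the second tGreenplumInput would run on each iteration
            // (Greenplum/PostgreSQL LIMIT/OFFSET; a BETWEEN on a row number works too).
            String query = "SELECT * FROM my_table ORDER BY id"
                    + " LIMIT " + fetchSize + " OFFSET " + offset;
            System.out.println("Iteration " + (i + 1) + ": " + query);
        }
        // Total DB calls: 1 (count) + 3 (fetches) = 4, returning 2,000, 2,000 and 1,500 rows.
    }
}
```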

Send me Love and Kudos

SRatna
Contributor
Author

Hi Jeremy, thanks for your response. I am already using a SQL query to fetch data from the Greenplum database. Can this be done along with that?

 

It would be more efficient for us to have a single API call so that all records are loaded in the same call. Is there a way I can send all records at once, like a bulk load?
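For illustration, if the service really does accept one arbitrarily large array, the single-call variant is simply one POST whose body contains every record. A minimal sketch under that assumption (the URL and record shape are placeholders); note that most REST services cap the request body size, which is why moderately sized batches are usually the safer route:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;

public class SingleCallLoad {
    public static void main(String[] args) throws Exception {
        // Placeholder records; in the real job these would come from the Greenplum query.
        List<String> records = List.of(
                "{\"id\":1,\"name\":\"a\"}",
                "{\"id\":2,\"name\":\"b\"}");

        // Every record concatenated into one JSON array: the whole data set in one API call.
        String payload = "[" + String.join(",", records) + "]";

        HttpRequest request = HttpRequest
                .newBuilder(URI.create("https://example.com/cache-service/records")) // placeholder URL
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("HTTP " + response.statusCode());
    }
}
```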

 
