How would I store that much data inside a context or global variable? What datatype would I use, and how would I get all the data into that one variable?
I am also facing the same issue. Did you resolve it? If yes, please let me know the solution.
Native Talend does not offer the described behaviour out of the box, but it can be achieved with a few components. I just wanted to share my solution:
Use case: I need to send 122,726 rows to a Power BI table via an API that is limited to 10,000 rows per call, which results in 13 calls; note that the last call uploads only 2,726 rows.
The trick is to create the correct exit condition for a tLoop, which we then use to split the 122K-row flow and fire the API call every 10,000 rows.
Several context variables support the process, including one holding the 10,000-row split size, so the same implementation works for an arbitrary chunk size and total row count.
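As a quick sanity check on the numbers in the use case, the call count is just a ceiling division of the total row count by the chunk size. A minimal sketch (class and method names are mine, not part of the Talend job):

```java
public class ChunkMath {
    static final int CHUNK_SIZE = 10_000; // the API limit per call

    // Number of API calls needed for a given total row count (ceiling division).
    static int callsNeeded(int totalRows) {
        return (totalRows + CHUNK_SIZE - 1) / CHUNK_SIZE;
    }

    public static void main(String[] args) {
        System.out.println(callsNeeded(122_726)); // prints 13
    }
}
```

For 122,726 rows this gives 13 calls: 12 full chunks of 10,000 and a last chunk of 2,726.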
The first tJava simply sets a valid condition so that the first tLoop iteration can run.
From then on, the tLoop checks a context variable that carries the number of rows processed during the previous iteration. We expect this number to be 10,000 every time; as soon as the number of processed rows drops below 10,000, it is time for the tLoop to exit.
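In Talend terms, this could look roughly like the following (the context variable names `processedRows` and `chunkSize` are my own, not from the original job):

```java
// First tJava: seed the counter so the very first tLoop check passes
context.processedRows = context.chunkSize;

// tLoop settings (Loop Type = While), Condition:
//   context.processedRows == context.chunkSize
```

Seeding the counter to the chunk size is what makes the first iteration valid; afterwards the counter reflects the real number of rows in the last batch.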
Every iteration needs to re-set the following:
The row counter, and the starting and ending points for the rows handled in that iteration.
Then extract the 10,000 rows of interest for the iteration.
Increment the row counter by 1 for every row, up to 10,000. If the counter does not reach 10,000, the rows are exhausted and the tLoop exits.
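The steps above can be sketched end to end in plain Java, outside Talend. All names here are mine (they stand in for the context variables and components of the job), and the row list stands in for the real data flow; a real job would POST each batch to the Power BI API:

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedUpload {
    static final int CHUNK_SIZE = 10_000;

    // Returns the number of API calls made for `totalRows` rows.
    static int uploadInChunks(int totalRows) {
        int start = 0;
        int rowCounter = CHUNK_SIZE; // seeded so the first loop check passes
        int calls = 0;

        while (rowCounter == CHUNK_SIZE) { // the tLoop exit condition
            // Re-set per-iteration state: row counter, starting and ending point
            rowCounter = 0;
            int end = Math.min(start + CHUNK_SIZE, totalRows);

            // "Extract" the rows in [start, end) and count them one by one
            List<Integer> batch = new ArrayList<>();
            for (int row = start; row < end; row++) {
                batch.add(row);
                rowCounter++;
            }

            // A real job would send `batch` to the Power BI API here
            calls++;
            start = end;
        }
        return calls;
    }

    public static void main(String[] args) {
        System.out.println(uploadInChunks(122_726)); // prints 14
    }
}
```

Note one design consequence of this exit condition: if the total is an exact multiple of 10,000, the loop runs one extra iteration with an empty batch before detecting that the rows are finished, so the counter check is only reliable when the last chunk is genuinely short.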
Done! 13 iterations: 12 with 10,000 rows each and one with 2,726.
Notes: