Hey,
I am pretty sure Talend should be able to handle this task relatively easily, but I am not sure of the best way to go about it.
I have 100,000 rows of data, but an API I am calling can only take 100 rows of data per API call.
I would like to execute an API call on 100 rows each time until I have looped through the full 100,000 row data set.
Any advice/recommended components for going about this are much appreciated.
Thanks,
Brian
I am using the tSOAP component to perform the API call. I am thinking of using tFlowToIterate --- something that counts to 100 --- then tIterateToFlow (somehow batching the rows into groups of 100), and then executing the call for each batch.
I think the main problem is how to aggregate the 100 rows of data to pass in one tSOAP call. If you can do that, then calling tSOAP only for every 100th row is quite easy, e.g.:
(construct your SOAP call, aggregating the 100 rows of data) --> tFilterRow (advanced mode: Numeric.sequence("s1",1,1) % 100 == 0) --> tSOAP --> (reset your SOAP call construct to start with the next set of 100 rows)
Don't forget an OnSubjobOk --> tSOAP for the remaining rows in excess of a multiple of 100.
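Just to illustrate the aggregate-and-flush idea in plain Java (a hypothetical sketch, not actual Talend code --- the class and method names are made up; in a job this logic would live in something like a tJavaFlex):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: accumulate rows and "flush" every 100th one,
// mimicking one tSOAP call per full batch plus one for the leftovers.
public class BatchAggregator {
    static final int BATCH_SIZE = 100;

    // Splits incoming rows into groups of at most BATCH_SIZE.
    public static List<List<String>> batch(List<String> rows) {
        List<List<String>> batches = new ArrayList<>();
        List<String> current = new ArrayList<>();
        for (String row : rows) {
            current.add(row);
            if (current.size() == BATCH_SIZE) { // every 100th row: flush
                batches.add(current);           // -> one tSOAP call
                current = new ArrayList<>();
            }
        }
        if (!current.isEmpty()) {               // rows in excess of a
            batches.add(current);               // multiple of 100
        }                                       // -> the OnSubjobOk tSOAP
        return batches;
    }
}
```

With 100,000 rows this yields exactly 1,000 batches; with 250 rows you get two full batches and one of 50, which is the case the OnSubjobOk trigger covers.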
I am thinking about creating a temp table or temp file and numbering the rows. Then use tFilterRow so that only rows numbered 100 or less are sent; the rest get written back to the temp file and renumbered.
Loop until the temp file/table is empty.
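The temp-file loop described above could be sketched like this in plain Java (again a hypothetical illustration, not Talend code --- the names are made up): take the first 100 pending rows, send them, keep the remainder, and repeat until nothing is left.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the temp-file loop: repeatedly send the first
// 100 rows and carry the remainder forward until the list is empty.
public class TempFileLoop {
    // Returns the number of "API calls" that would be made.
    public static int process(List<String> pending) {
        int calls = 0;
        List<String> remaining = new ArrayList<>(pending);
        while (!remaining.isEmpty()) {
            int n = Math.min(100, remaining.size());
            List<String> toSend = new ArrayList<>(remaining.subList(0, n));
            // ... here the batch in toSend would go to tSOAP ...
            calls++;
            // "renumber" the rest: they become the new front of the file
            remaining = new ArrayList<>(remaining.subList(n, remaining.size()));
        }
        return calls;
    }
}
```

For 100,000 rows this loop runs 1,000 times, matching the expected number of API calls.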
@jlolling, thanks, for getting back to me so fast!
Sorry to be a pain, but would you mind elaborating a little more? I'm still fairly new to the tool. So inside the tMap, should I have a filter in there?