Hi,
We are using Talend Data Integration 7.x, and I have an issue with one of my requirements.
My pipeline consists of the components tDBInput -> tMap -> tDBOutput. A huge volume of data comes in from the source tDBInput (around 10 million rows), so I have to split it into ranges, for example 1 to 1,000,000, then 1,000,001 to 2,000,000, and so on.
Can I process each range sequentially so that the job does not fail?
I tried to use a Row > Iterate link, but an iterate connection cannot be used from tDBInput to tMap.
Please suggest a better approach. I can't run the ranges in parallel, because the data is huge and that might affect the server's capacity.
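For reference, here is a rough sketch of the chunked approach I have in mind, using tLoop to drive one range per iteration. The query expression below would go in tDBInput; my_table, id, col1 and col2 are placeholder names, and it assumes an indexed integer id column:

    // tLoop_1 set up as a "For" loop: from 0, to 9, step 1
    // (10 iterations x 1,000,000 rows covers the ~10 million rows).
    // tDBInput query expression, reading one slice per iteration:
    "SELECT id, col1, col2 FROM my_table WHERE id > "
        + ((Integer) globalMap.get("tLoop_1_CURRENT_VALUE")) * 1000000
        + " AND id <= "
        + (((Integer) globalMap.get("tLoop_1_CURRENT_VALUE")) + 1) * 1000000

The flow would then be tLoop --(Iterate)--> tDBInput --(Main)--> tMap --(Main)--> tDBOutput, so each million-row slice is fully read and written before the next one starts. Would something like that work here?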
@xdshi, any suggestions?
@srkalakonda, does this help?
If so, please mark the case as solved.