_AnonymousUser
Specialist III

Row limit of 800,000 in TOS 6.1.1

Ultra-simple job: CSV source, tMap, Oracle output. The CSV source is roughly 2M rows, but the job stops at exactly 800K with no error message and no error indicator. I cannot find any limits set in the components, the job, or the project. Please suggest ideas, or a diagnostic that can be inserted or enabled.
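A minimal diagnostic sketch, assuming a tJavaRow is dropped between the CSV input and the tMap to prove how many rows actually flow through that point (the column name id is a placeholder for whatever the real schema uses; a real job would copy every column):

```java
// tJavaRow body: pass the row through unchanged and keep a running count
// in globalMap so we can see exactly where the flow stops.
output_row.id = input_row.id;  // repeat for each column in the schema

Integer seen = (Integer) globalMap.get("rowCount");
seen = (seen == null) ? 1 : seen + 1;
globalMap.put("rowCount", seen);

// Print a heartbeat every 100,000 rows, plus every row near the
// suspect 800K boundary.
if (seen % 100000 == 0 || (seen > 799990 && seen < 800010)) {
    System.out.println("tJavaRow diagnostic: row " + seen);
}
```

If the heartbeat stops at 800,000 with the component placed before the tMap, the limit is upstream in the CSV read; if it sails past 800,000, move the component downstream and look at the tMap or the Oracle output instead.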

6 Replies
Anonymous
Not applicable

Hi,
Have you already checked the "Die on error" option on the tOracleOutput component, to see whether any error message gets printed to the console?
Did you try storing the tMap data on disk instead of in memory?
Best regards
Sabrina
Anonymous
Not applicable

Hi,
We have designed a Talend job to process bulk data and update those records in Salesforce.
When we execute it against bulk records, it fails because of a trigger/Apex code in the system.
Instead of optimising that code, is there a component or some other way to work around it?
That is, read the first set of 200 records, pass those 200 through to the end of the process, then take the next set of 200 and process them to the end, and so on until all records are done (the chunking idea is sketched in plain Java below).
Is there any solution for this in Talend?
Regards,
Reji
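A minimal sketch of that chunking idea in plain Java, assuming the records can be collected into a list before upload; sendBatch is a hypothetical stand-in for the actual Salesforce write:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchDemo {

    // Split a record list into fixed-size chunks so that each Salesforce
    // call stays small enough to finish within the Apex CPU budget.
    static <T> List<List<T>> chunk(List<T> records, int size) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < records.size(); i += size) {
            batches.add(records.subList(i, Math.min(i + size, records.size())));
        }
        return batches;
    }

    // Hypothetical stand-in for the real Salesforce upload step.
    static void sendBatch(List<String> batch) {
        System.out.println("Uploading batch of " + batch.size() + " records");
    }

    public static void main(String[] args) {
        List<String> records = new ArrayList<>();
        for (int i = 0; i < 1000; i++) records.add("record-" + i);

        // Process 200 records end-to-end, then move on to the next 200.
        for (List<String> batch : chunk(records, 200)) {
            sendBatch(batch);
        }
    }
}
```

Inside a Talend job, the equivalent knob is usually a smaller batch/commit size on the Salesforce output or bulk component, if the version in use exposes one, rather than a hand-rolled loop.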
Anonymous
Not applicable

Hi Reji,
"When we execute it against bulk records, it fails because of a trigger/Apex code in the system."

Could you please show us screenshots of your current job design? Did you use a bulk component for Salesforce?
Best regards
Sabrina
Anonymous
Not applicable

Hi,
Below is the error we are actually getting:
CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY:Network_Profile_AfterInsertUpdateUndelete_BeforeDel: System.LimitException: Apex CPU time limit exceeded
Yes, we are using a bulk component for Salesforce.
Executing 100 records manually does not cause this error, but running them as a batch does.
Below is a screenshot of the bulk component connection.

[screenshot: bulk component connection settings]


My question remains: is there an option to read the first set of 200 records, pass them through to the end of the process, then take the next set of 200, and so on until all records are processed?
Is there any solution for this in Talend?
Anonymous
Not applicable

Please find the screenshot:

[screenshot: job design]
TRF
Champion II

Hi,
That is not a good approach.
You should look at how to bypass Apex classes and other active components when loading high data volumes into Salesforce.
This is not a Talend subject; check the Salesforce Developer forum instead.
Regards,
TRF