Anonymous
Not applicable

Performance on a job

Hi,
For me, Talend is a very good solution, but now I am trying to run performance tests, and my results are disastrous.
I attach a screenshot of my job, which describes my business rules.
My project is to replace a loader built with Access.
With my current loader, the execution time of this job is 4 minutes and 30 seconds.
With Talend, the execution time of this job is 1 hour and 28 minutes.
How can I improve the performance?
Thanks.
[Attachment: 0683p000009MCHG.jpg]
29 Replies
Anonymous
Not applicable
Author

Hi suzchr,
Is there a special reason for using a tBufferOutput? Can't you load directly into Access instead of loading into the buffer first?
Anonymous
Not applicable
Author

Yes, that's done, but the best time I get is 22 minutes. I am trying to use two jobs to improve on this.
After my tests I can say that when writing to Access, the best commit value is 125,000 rows; but if I write to Access in the same job that reads from my data warehouse, with a commit of 125,000 rows the run time is 6 hours 40 minutes...
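The commit-interval trade-off described above can be sketched outside Talend. A minimal illustration in Python, using sqlite3 purely as a stand-in for Access (the table name and row counts are hypothetical, not from the original job):

```python
import sqlite3

def load_rows(rows, commit_every=125_000):
    """Insert rows, committing every `commit_every` inserts.

    Fewer, larger commits amortize per-transaction overhead, which
    is the same effect that tuning the commit value has on a Talend
    database output component.
    """
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE target (id INTEGER, payload TEXT)")
    pending = 0
    for row in rows:
        conn.execute("INSERT INTO target VALUES (?, ?)", row)
        pending += 1
        if pending >= commit_every:
            conn.commit()
            pending = 0
    conn.commit()  # flush the final partial batch
    count = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
    conn.close()
    return count

print(load_rows(((i, "x") for i in range(300_000)), commit_every=125_000))
```

The optimal commit size depends on the target database; 125,000 was simply the best value found experimentally for Access here.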
Anonymous
Not applicable
Author

To help, I attach two screenshots showing the two jobs.
[Attachments: 0683p000009MCS4.jpg, 0683p000009MCOH.jpg]
Anonymous
Not applicable
Author

Hi suzchr,
Try testing with only one tMap; it accepts many inputs and outputs.
You may also set a better expression key to improve performance.
Regards
Anonymous
Not applicable
Author

I don't have a key defined on my schema. Do you think defining a key would improve the performance?
amaumont
Contributor III
Contributor III

Be careful not to confuse the Key column in a schema with the Expression key column in tMap.
Checking the Key column in a schema will not improve performance in tMap, but setting an Expression key on one of your lookups will.
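A rough sketch of why the lookup key matters, assuming that without a key the match degenerates into a linear scan per main row (the column names are hypothetical):

```python
# Without a key, matching each main-flow row against the lookup is a
# nested scan: O(main * lookup) comparisons. With an expression key,
# the lookup can be indexed once and probed in O(1), like a hash join.

def join_without_key(main_rows, lookup_rows):
    out = []
    for m in main_rows:
        for l in lookup_rows:          # full scan for every main row
            if m["cust_id"] == l["cust_id"]:
                out.append({**m, **l})
                break
    return out

def join_with_key(main_rows, lookup_rows):
    index = {l["cust_id"]: l for l in lookup_rows}   # built once
    return [{**m, **index[m["cust_id"]]}
            for m in main_rows if m["cust_id"] in index]
```

Both return the same matches on unique lookup keys; only the cost differs, which is why a well-chosen expression key can change the run time dramatically on large lookups.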
Anonymous
Not applicable
Author

OK, I've defined a key on my second tMap. I will post the results afterwards.
Thanks.
Anonymous
Not applicable
Author

Have you tried loading a temporary table as the output of the first job?
Then you could map it to the output table in the second job.
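The staging-table idea can be sketched like this (sqlite3 again stands in for the real databases; the table names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (id INTEGER, payload TEXT)")
conn.execute("CREATE TABLE target (id INTEGER, payload TEXT)")

# Job 1: bulk-load the extract into the temporary/staging table.
conn.executemany("INSERT INTO staging VALUES (?, ?)",
                 [(i, f"row-{i}") for i in range(1000)])
conn.commit()

# Job 2: map the staging table to the final table in one set-based
# pass, instead of transforming and writing row by row.
conn.execute("INSERT INTO target SELECT id, payload FROM staging")
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 1000
```

Splitting extract and load this way keeps the slow Access writes out of the job that reads the data warehouse, which matches the two-job approach discussed above.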
Anonymous
Not applicable
Author

Hi,
Make sure that the metadata schemas in your tMap and in your tAccessOutput match exactly (including data types) the schema in your final database.
_AnonymousUser
Specialist III
Specialist III

Hi, can someone tell me how to increase the speed (number of rows per second) between source and destination? For example, if a CSV file contains more than 100,000 (1 lakh) rows, the job only processes 200-300 rows/s while running. How can I increase it to more than 1000 rows/s? Please, anyone, reply.
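The usual first lever for rows/s is batching: send rows to the destination in chunks rather than one statement per row. A sketch in Python, with sqlite3 standing in for the destination and an in-memory string standing in for the CSV file (all names and sizes hypothetical):

```python
import csv
import io
import sqlite3

# Hypothetical CSV extract; in a Talend job this would come from a
# tFileInputDelimited component.
csv_text = "id,name\n" + "\n".join(f"{i},name{i}" for i in range(100_000))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dest (id INTEGER, name TEXT)")

reader = csv.reader(io.StringIO(csv_text))
next(reader)  # skip the header row

# Batch the inserts: one executemany call per chunk instead of one
# round-trip per row. Fewer round-trips is what raises rows/s.
BATCH_SIZE = 10_000
batch = []
for row in reader:
    batch.append((int(row[0]), row[1]))
    if len(batch) >= BATCH_SIZE:
        conn.executemany("INSERT INTO dest VALUES (?, ?)", batch)
        batch.clear()
if batch:  # flush the final partial batch
    conn.executemany("INSERT INTO dest VALUES (?, ?)", batch)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM dest").fetchone()[0])  # 100000
```

In Talend itself the equivalent knobs are the batch size and commit interval on the database output component; the batch size here is only an illustrative value to tune for your own target.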