Soumya_M
Contributor

Delimited file output

Hello, I've created a Talend job that inserts data into a table on a daily basis [tDBRow], then executes a select statement [tPostgresqlInput] on that data to produce output as a .csv file [tFileDelimited]. It looks like this:

[Screenshot: job layout]

If I run each subjob individually, it completes within a few seconds, but when I chain them in sequence, the whole job takes much longer [around half an hour].

There isn't any issue with the components themselves; they run simple insert and select statements. I need your help understanding why the whole job takes half an hour when each part takes only a few seconds on its own. Also, the majority of the time is spent after the second OnSubjobOk. Can anyone help me with this?
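One way to narrow down which stage is slow is to time each step outside of Talend. Below is a minimal sketch of that idea; it uses Python's built-in sqlite3 as a stand-in for PostgreSQL, and the table and column names (`daily_data`, `id`, `value`) are hypothetical, not taken from the actual job:

```python
import csv
import sqlite3
import time

# Hypothetical stand-in schema; the real job uses a PostgreSQL table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE daily_data (id INTEGER, value TEXT)")
rows = [(i, f"value-{i}") for i in range(500_000)]

# Stage 1: insert (analogous to the tDBRow step)
t0 = time.perf_counter()
conn.executemany("INSERT INTO daily_data VALUES (?, ?)", rows)
conn.commit()
t_insert = time.perf_counter() - t0

# Stage 2: select (analogous to the tPostgresqlInput step)
t0 = time.perf_counter()
result = conn.execute("SELECT id, value FROM daily_data").fetchall()
t_select = time.perf_counter() - t0

# Stage 3: write the delimited file (analogous to the tFileDelimited step)
t0 = time.perf_counter()
with open("output.csv", "w", newline="") as f:
    csv.writer(f, delimiter=";").writerows(result)
t_write = time.perf_counter() - t0

print(f"insert: {t_insert:.2f}s  select: {t_select:.2f}s  write: {t_write:.2f}s")
```

Comparing the three timings against the full job's runtime shows whether the slowdown lives in one of the stages or in the handoff between them.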

2 Replies
Anonymous
Not applicable

Hello,

We need a bit more information to address your performance issue.

- How many rows are processed by your job?

- Are your databases local or on a network?

Before launching the job, you could check the "Statistics" box to see where the data flow is slow (we suspect the flow is slowing at the entry point --> the DB input).
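If the DB input does turn out to be the slow point, the query plan on the database side is worth a look. The sketch below uses sqlite3's EXPLAIN QUERY PLAN purely as an illustration (on PostgreSQL the equivalent check would be EXPLAIN ANALYZE in psql); the table and index names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE daily_data (id INTEGER, value TEXT)")
conn.executemany("INSERT INTO daily_data VALUES (?, ?)",
                 [(i, f"v{i}") for i in range(1000)])

# Without an index, a filtered select falls back to a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM daily_data WHERE id = 42"
).fetchall()
print(plan_before)

# With an index on the filter column, the planner can search instead of scan.
conn.execute("CREATE INDEX idx_daily_id ON daily_data (id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM daily_data WHERE id = 42"
).fetchall()
print(plan_after)
```

The detail column of the plan rows shows "SCAN" before the index exists and a "SEARCH ... USING INDEX" afterwards; the same scan-vs-index distinction is what to look for in PostgreSQL's EXPLAIN output.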

Best regards

Sabrina

Soumya_M
Contributor
Contributor
Author

The query runs on a table of around 500k rows, and the DB is on the local machine. Okay, is the "Statistics" box the same as tStatCatcher Statistics?