Anonymous
Not applicable

Jobs running slow

Hi,
Could someone help me, please? I have a job whose input is a delimited file with about 500,000 rows (it could be more in future). I designed a simple job with many lookup tables and processes, using tLogRow throughout and tMap as the main component. This job is taking a long time to run: more than a day or two.
Could someone tell me the best way/components to use so that a job consisting of delimited CSV input files and MySQL tables runs faster?
I can't seem to attach the screenshot right now; if you think you can help, I can email it to you.
Many Thanks
2 Replies
Anonymous
Not applicable
Author

There are various approaches you might take, but first you need to check where the job is actually bottlenecking right now. Do the lookup tables have many different values or just a few? Are they cached? Indexed, etc.? If your MySQL backend is on a more powerful system, you could stage the input data there and then use a join (outer, if necessary) to do the lookups and produce a new data stream to feed the rest of the job.
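To illustrate the staging-plus-join idea outside Talend, here is a minimal Python sketch using SQLite as a stand-in for the MySQL backend. All table and column names are made up for the example; the point is that one indexed, set-based outer join replaces hundreds of thousands of individual per-row lookups.

```python
# Sketch only: SQLite stands in for MySQL; schema and data are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stage the delimited input into a table instead of looking up row by row.
cur.execute("CREATE TABLE staged_input (id INTEGER, code TEXT)")
cur.executemany("INSERT INTO staged_input VALUES (?, ?)",
                [(1, "A"), (2, "B"), (3, "Z")])

# The lookup table, with an index so the join does not scan per row.
cur.execute("CREATE TABLE lookup (code TEXT, label TEXT)")
cur.executemany("INSERT INTO lookup VALUES (?, ?)",
                [("A", "Alpha"), ("B", "Beta")])
cur.execute("CREATE INDEX idx_lookup_code ON lookup(code)")

# One LEFT (outer) JOIN does all the lookups in a single pass;
# unmatched rows survive with a NULL label instead of being dropped.
rows = cur.execute("""
    SELECT s.id, s.code, l.label
    FROM staged_input s
    LEFT JOIN lookup l ON l.code = s.code
    ORDER BY s.id
""").fetchall()
# rows -> [(1, 'A', 'Alpha'), (2, 'B', 'Beta'), (3, 'Z', None)]
```

The resulting stream can then feed the rest of the job, with the heavy lookup work pushed down to the database.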
T
Anonymous
Not applicable
Author

Hi,
Get rid of tLogRow; it is a debugging component for tracing rows to the console.
You should only use it on a small amount of data (<10,000 rows) while testing.
tLogRow will slow the whole data flow down to around 100 rows/sec...
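The same idea in plain Python, as a rough sketch: only log a small sample of rows while testing, and skip per-row console output entirely in production runs. The row transformation and the sample size here are illustrative, not part of the original job.

```python
# Sketch only: the transformation and sample size are made up for illustration.
SAMPLE_LIMIT = 1_000  # log at most this many rows, like a capped tLogRow

def process_flow(rows, debug=False):
    out = []
    for i, row in enumerate(rows):
        result = row * 2          # stand-in for the real tMap transformation
        if debug and i < SAMPLE_LIMIT:
            print(result)         # console output only for the test sample
        out.append(result)
    return out

# In production, run with debug=False so no per-row console I/O happens.
results = process_flow(range(5), debug=False)
# results -> [0, 2, 4, 6, 8]
```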
benjamin