Anonymous
Not applicable

Job Running Slow

Hello
I need support: I have a problem with my job, it is very slow. I have a main job which contains several child jobs. The main job queries each record and pipes it to the corresponding sub job, but this whole process is very slow.
I do not know whether what makes it slow is the use of the tRowGenerator component or the number of levels involved (I have three levels of sub jobs).

This is the structure of my job:
Main job:
tAS400Input --> tMap --> tSetGlobalVariable --> tFilterRow1 (filter) --> tRunJob1

Child job 1:
tRowGenerator1 --> tSetGlobalVar --> tMap1 --> tFilterRow1 (filter) --> tRunJob2
    |
    +--> tRunJob3

Child job 2:
tRowGenerator1 --> tSetGlobalVar --> tMap1 --> tFilterRow1 (filter) --> tRunJob4
    |
    +--> tRunJob5

I appreciate your help...
5 Replies
Anonymous
Not applicable
Author

Hi
I don't know more details about your job, but I have two points on the current job design:
1. Why don't you do the filter operation directly in the tMap of the main job?
2. How many rows do you generate in tRowGenerator? If only one row, I suggest using the tFixedFlowInput component instead.
There must be some places where the job design can be optimized.
Best regards
Shong
Anonymous
Not applicable
Author

Hi,
Can you post a job screenshot?
Do you use many row and Iterate links? Try to replace them.
You can use the TOS statistics to check where you lose the most time.
Anonymous
Not applicable
Author

Thank you very much shong / tmaurin ...
Attached are three pictures which show the design of the job. I use some tHashOutput components; I do not know whether this is what makes the process so slow. Please tell me how I can optimize it. I also use a tRowGenerator at the entrance of each subjob.
My idea is to process each record, filter them according to business rules and score them according to their evaluation, but for now this process is very slow: 600 records are processed in 15 minutes.
Thank you!
Anonymous
Not applicable
Author

Hi
The job is a little big and it consumes a lot of memory; tHashOutput, tFilterRow and tMap all consume memory. Try the following ways to optimize the job:
1. Output the records into a file using tFileOutputDelimited instead of storing them in memory with tHashOutput.
2. Use the tFixedFlowInput component instead of tRowGenerator.
3. Store the lookup flow on disk instead of in memory.
4. Open the Run view --> Advanced settings panel and allocate more memory to execute the job (see the sketch after this list).
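For point 4, a minimal sketch: check "Use specific JVM arguments" in the Run view --> Advanced settings and raise the heap. The values below are only example figures to adjust to your machine, not recommendations for this particular job:

    -Xms256M
    -Xmx1024M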
Best regards
Shong
Anonymous
Not applicable
Author

5. Try to move your filter from tFilterRow into your tMap, as sketched below. You can certainly decrease the number of components used.
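As a hypothetical illustration (assuming an input flow named row1 with a String column named status, neither of which appears in your post), the same condition that tFilterRow applies can be written as a Java expression filter on the tMap output, for example:

    row1.status != null && "ACTIVE".equals(row1.status)

Only rows for which the expression evaluates to true are passed to that output, so the separate tFilterRow component can be removed while keeping the same filtering logic.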