rohit1804
Contributor III

How to improve performance of a job with 30 million rows in the lookup file

Hi All,

I have a job with 30 million records in the lookup, and I am running it on a Unix server. Because of the huge amount of data in the lookup file it is taking a very long time; I started the job in the morning and it is still running. Please help me resolve this.

6 Replies
Anonymous
Not applicable

You will need to tell us a bit more, preferably with screenshots of your job and your tMap/lookup configuration (I assume you are using a tMap).

rohit1804
Contributor III
Author

hi,

Thanks for the reply. I don't have permission to take screenshots of the job because of company policy; I hope you understand. And yes, I am using a tMap there.

 

manodwhb
Champion II

@rohit1804, have you enabled "Store temp data" in the tMap's advanced settings? You can also try increasing the max buffer size.

ankit7359
Creator II

Hi @rohit1804 ,

You can try any of the steps below:

1. Check the Multi thread execution option in the Job settings and adjust the number of threads as needed (3 is the common recommendation).

2. Enable Set parallelization on the source component of the job, which is another way to improve performance.

3. Increase the JVM arguments, for example from 1024M to 4096M. If you have more RAM available, you can go up to 5G.
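For step 3, one way to confirm that the JVM arguments actually took effect is to print the heap limits at runtime, for example from a tJava component in the job. A minimal sketch (the class name is made up; only the `Runtime` calls matter):

```java
public class HeapCheck {
    public static void main(String[] args) {
        // maxMemory() reflects the -Xmx value the job was started with;
        // totalMemory() is the heap currently allocated (related to -Xms).
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        long totalMb = Runtime.getRuntime().totalMemory() / (1024 * 1024);
        System.out.println("Max heap (MB): " + maxMb);
        System.out.println("Current heap (MB): " + totalMb);
    }
}
```

If the printed max heap does not match what you passed on the command line, the arguments are not reaching the job's JVM.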

 

Thanks,

Ankit

Anonymous
Not applicable

In the DB input component, go to Advanced settings and enable the cursor; set the cursor size based on the number of rows to be processed. It will make a big difference. Also, in the tMap, enable storing temp data on disk, and increase the JVM memory values.
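The cursor size typically maps to the JDBC fetch size under the hood (an assumption that depends on your driver): rows are pulled from the server in batches instead of being materialised all at once. A rough, self-contained sketch of the trade-off, with illustrative numbers:

```java
public class CursorSizeSketch {
    // Simulates what a cursor does: rows arrive in fetchSize-sized batches,
    // so peak memory is proportional to one batch, not to the whole table,
    // at the cost of one network round trip per batch.
    static int countRoundTrips(int totalRows, int fetchSize) {
        int roundTrips = 0;
        for (int fetched = 0; fetched < totalRows; fetched += fetchSize) {
            roundTrips++; // one round trip fetches up to fetchSize rows
        }
        return roundTrips;
    }

    public static void main(String[] args) {
        // 30 million rows with a cursor size of 100,000:
        // 300 batches, but memory for only 100,000 rows at a time.
        System.out.println(countRoundTrips(30_000_000, 100_000));
    }
}
```

A larger cursor size means fewer round trips but more memory per batch, so tune it against the heap you gave the job.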

Anonymous
Not applicable

How many rows are loading into the tMap through the Main source, and how have you configured the lookup? If you have a limited number of Main rows, you might want to reduce the number of lookup rows by using the "Reload at each row" setting. This allows you to put the value(s) you join on into the globalMap and use them in your lookup data query, so that when the lookup query fires it only brings back data relevant to the current Main row.
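A hedged sketch of that "Reload at each row" pattern (the table, column, and key names are made up; in a real job the tMap's globalMap key expressions and the lookup input component's query do this for you):

```java
import java.util.HashMap;
import java.util.Map;

public class ReloadAtEachRowSketch {
    // Builds the lookup query for one Main row using a join value
    // previously stored in the globalMap, so the lookup only fetches
    // matching rows instead of all 30 million.
    static String buildLookupQuery(Map<String, Object> globalMap) {
        return "SELECT * FROM lookup_table WHERE customer_id = "
                + globalMap.get("customerId");
    }

    public static void main(String[] args) {
        // Stand-in for Talend's globalMap.
        Map<String, Object> globalMap = new HashMap<>();

        // For each Main row, the join key is put into the globalMap...
        globalMap.put("customerId", 4711); // hypothetical key from the Main flow

        // ...and the lookup query references it at reload time.
        System.out.println(buildLookupQuery(globalMap));
    }
}
```

The cost is one lookup query per Main row, which is why this only pays off when the Main flow is small relative to the lookup.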