I have 40 GB of data and it is taking more than 10 hours to load. Can you please share some tips to optimize the performance?
1) Drop temporary tables as soon as they are no longer needed.
2) Try to avoid joins where possible.
3) Make sure there are no synthetic keys or circular references in the data model.
4) If possible, reduce the number of tables in the data model.
5) Use MAPPING LOAD / ApplyMap() instead of joins where applicable.
6) Apply filter conditions at script level so that less data is loaded.
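A minimal sketch of points 5 and 6 in load-script form. The table and field names (CountryCode, OrderDate, etc.) are hypothetical; note that a WHERE clause other than WHERE Exists() forces an unoptimized QVD read, so there is a trade-off between filtering early and keeping the load optimized:

```
// Point 5: mapping table (exactly two columns, key and value)
CountryMap:
MAPPING LOAD
    CountryCode,
    CountryName
FROM Countries.qvd (qvd);

// Apply the map while loading instead of joining a second table,
// and filter at script level (point 6) so fewer rows reach the model
Orders:
LOAD
    OrderID,
    ApplyMap('CountryMap', CountryCode, 'Unknown') AS Country,
    Amount
FROM Orders.qvd (qvd)
WHERE OrderDate >= '2014-01-01';
```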
Hi,
Optimization is complex and depends a lot on the data in your specific case. Is it a single LOAD statement that takes so long? Is a lot of logic (transformations) applied while reading?
The single thing that will probably save the most time is using some form of delta (incremental) load. This means you do an initial full load and store the data to a QVD. On subsequent reloads you extract only the new/changed rows from the source and append them to, or update, the QVD.
If you share the load script it is easier to give tips.
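The incremental pattern described above can be sketched like this (insert-plus-update variant). The table and field names are hypothetical, and it assumes the source has a LastModified timestamp and that a first full load has already created Orders.qvd:

```
// Find the high-water mark from the previous run
MaxDate:
LOAD Max(LastModified) AS MaxModified
FROM Orders.qvd (qvd);
LET vMaxModified = Peek('MaxModified', 0, 'MaxDate');
DROP TABLE MaxDate;

// Pull only new/changed rows from the source database
Orders:
SQL SELECT OrderID, Amount, LastModified
FROM dbo.Orders
WHERE LastModified > '$(vMaxModified)';

// Append the historical rows, skipping any that were re-extracted
// (the fresh version from the database wins)
CONCATENATE (Orders)
LOAD * FROM Orders.qvd (qvd)
WHERE NOT Exists(OrderID);

// Write the combined result back for the next run
STORE Orders INTO Orders.qvd (qvd);
```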
Thank you both
Hi,
try to set up a stored procedure in the database and use it to extract the data into several QVDs through incremental loads.
Ensure that the tables in the database are indexed (especially on the fields used in your WHERE clauses) for fast loading.
Take a look at this...
Regards
André Gomes
Hi,
Go through the attached documents
Regards
ASHFAQ
To improve distribution performance to users, make sure you have installed MS hotfix KB2600217.
What connectivity do you use?
Have a look at this document too: http://community.qlik.com/message/435899#435899
In addition to the other recommendations, have you converted all your input files into QVDs first?
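The extract-to-QVD step can be sketched as below. File names are hypothetical; the point is that a plain LOAD from a QVD with no transformations (and at most a WHERE Exists() clause) runs in optimized mode, which is typically far faster than reading the original source each time:

```
// One-time (or scheduled) extract: read the slow source once, store as QVD
RawData:
LOAD *
FROM SourceData.csv (txt, utf8, embedded labels, delimiter is ',');
STORE RawData INTO SourceData.qvd (qvd);
DROP TABLE RawData;

// In the application script, load from the QVD instead of the source.
// LOAD * with no transformations keeps the QVD read optimized.
Data:
LOAD *
FROM SourceData.qvd (qvd);
```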
Regards,
Neil