Anonymous
Not applicable

How to optimize exporting big data from QlikView

Hello everyone! Users need to export 2 million rows with 4 columns from a QlikView report (Access Point). At the moment it takes approximately 1-2 hours. How can I optimize this report on the server for the users?

Thanks in advance

3 Replies
marcus_sommer

Maybe your users could export to a text file instead of Excel, but it's generally not a good idea to export such large amounts of data through the Access Point. It would be better to generate these data within the script and store them in an appropriate storage location.
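A minimal sketch of that idea in the load script (table, field and path names here are illustrative assumptions, not your actual model) could look like this:

// Pull the four columns the users need from a table already loaded in the app.
ExportData:
LOAD Field1,
     Field2,
     Field3,
     Field4
RESIDENT ReportTable;

// Write the rows to a delimited text file on a share the users can reach,
// so nothing has to be exported through the Access Point.
STORE ExportData INTO [\\fileserver\exports\ExportData.csv] (txt);

DROP TABLE ExportData;

A scheduled reload then produces the file server-side, and users pick it up from the share instead of exporting 2 million rows through the browser.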

- Marcus

bjsellers57
Contributor II

I do a similar thing, pulling in a large amount of data. I created 8 different QVD loader tables that pull data from SQL Server. These QVDs are built first: 4 are refreshed daily and 4 are incrementally loaded every 10 minutes. Keeping them separate lets me set the desired full refreshes and incremental loads per table. The main reporting app reads in the QVDs on a timed basis, again every 10 minutes, and pulling them in as resident tables takes approximately 6 minutes because the QVD loads are optimized. It works well. The data set for my main QVD loader file is 4 million records with 154 fields, and there are other tables that link to the main one with similarly sized data sets. I also do file locking: the QVD loaders don't execute while the main reporting app is pulling in the QVDs, and the reporting app doesn't execute while the loaders run.
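As a rough sketch of that pattern (the connection, table, field and file names below are assumptions for illustration, not my actual setup):

// Simple flag-file "lock": skip this run if the main reporting app is reloading.
// (The exact locking mechanism you choose may differ.)
IF Not IsNull(FileSize('ReportingApp.lock')) THEN
    EXIT SCRIPT;
END IF

ODBC CONNECT TO [SQLServerDSN];

// High-water mark for the incremental pull; in practice it would come from
// the previous run (e.g. the max ModifiedDate already stored in the QVD).
LET vLastLoad = '2024-01-01 00:00:00';

// 1) Pull only rows changed since the last run from SQL Server.
Orders:
LOAD OrderID, Amount, Status, ModifiedDate;
SQL SELECT OrderID, Amount, Status, ModifiedDate
FROM dbo.Orders
WHERE ModifiedDate >= '$(vLastLoad)';

// 2) Append the unchanged history from the existing QVD.
//    A QVD load filtered only by a single-field Exists() stays optimized.
CONCATENATE (Orders)
LOAD OrderID, Amount, Status, ModifiedDate
FROM Orders.qvd (qvd)
WHERE Not Exists(OrderID);

// 3) Store the merged result back so the main reporting app gets a fast,
//    optimized QVD load every 10 minutes.
STORE Orders INTO Orders.qvd (qvd);
DROP TABLE Orders;

The main reporting app then does plain optimized loads from the QVDs and writes/removes its own lock file around that step.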

Not applicable
Author

The same problem occurs when exporting to *.txt or *.csv: it either takes a lot of time or the report gives a disconnect error.