Hello everyone! Users need to export 2 million rows with 4 columns from a QlikView report (Access Point). At the moment it takes approximately 1-2 hours. How can I optimize this report on the server for users?
Thanks in advance
Maybe your users could export to a text file instead of Excel, but it's generally not a good idea to export such large amounts of data through the Access Point. It would be better to generate this data within the script and store it in an appropriate storage location.
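A minimal sketch of that script-side approach, assuming a QVD source; the table, field, and path names here are made up for illustration:

```qlikview
// Hypothetical example: build the export during the scheduled reload
// instead of letting users export from the Access Point.
ExportData:
LOAD
    Field1,
    Field2,
    Field3,
    Field4
FROM [D:\QVD\BigTable.qvd] (qvd);   // optimized QVD load

// STORE writes the file server-side; a txt export of 2 million
// rows is far faster than pushing the same data out as Excel.
STORE ExportData INTO [D:\Exports\ExportData.txt] (txt);

DROP TABLE ExportData;
```

Users can then pick up the finished text file from a share instead of waiting on a browser export.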
- Marcus
I do a similar thing pulling in a large amount of data. I created 8 different QVD loader tables that pull data from SQL Server. These QVDs are built first: 4 are refreshed daily and 4 are incrementally loaded every 10 minutes. Keeping them separated lets me set the desired refresh and incremental-load schedules independently.

The main reporting app reads in the QVDs on a timed basis, again every 10 minutes. Pulling in the resident tables takes approximately 6 minutes, since the QVD loads are optimized. It works well. The data set for my main QVD loader file is 4 million records with 154 fields, and there are others linked to the main one with similarly sized data sets.

I also do file locking on the QVD loaders so the main reporting app doesn't execute at the same time, and while the main reporting app is pulling the QVDs into resident tables, I lock the QVD loaders out from executing.
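The incremental-load and lock pattern described above might look roughly like this in a loader script; the table, field, and lock-file names are assumptions, not the poster's actual setup:

```qlikview
// Hypothetical QVD loader with a simple file-based lock.
// Skip this reload if the reporting app has flagged it is reading the QVDs.
IF Not IsNull(FileSize('D:\QVD\reporting.lock')) THEN
    TRACE Reporting app is reading QVDs - skipping this run.;
    EXIT SCRIPT;
END IF

// Incremental part: fetch only rows changed since the last successful run.
Orders:
SQL SELECT OrderID, CustomerID, Amount, ModifiedDate
FROM dbo.Orders
WHERE ModifiedDate >= '$(vLastExecTime)';

// Append the historical rows from the existing QVD,
// keeping only keys not already loaded (stays an optimized load).
Concatenate (Orders)
LOAD * FROM [D:\QVD\Orders.qvd] (qvd)
WHERE Not Exists(OrderID);

// Write the merged result back for the reporting app to consume.
STORE Orders INTO [D:\QVD\Orders.qvd] (qvd);
DROP TABLE Orders;
```

The reporting app would create `reporting.lock` before its timed reload and delete it afterwards, mirroring the locking in the opposite direction.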
The same problem occurs when exporting to *.txt or *.csv: it takes a long time, or the report gives a disconnect error.