Masi_Sahargahi
New Contributor III

export to excel is very slow

Hi all,

I have a straight table with about 270,000 records and 93 columns.

When I export it to Excel, it never finishes.

Can anybody help me out with this?

5 Replies
marcus_sommer
MVP & Luminary

Re: export to excel is very slow

With this amount of records and columns it will be slow, and you should reckon with hours rather than minutes for the export. It's also possible that you hit a limitation somewhere within the process and the export breaks.

Instead of exporting it to Excel you could also export it as CSV, which opens in Excel without any problem. Another possibility would be to store the data during the load process.

- Marcus

Masi_Sahargahi
New Contributor III

Re: export to excel is very slow

Thanks for your answer @marcus_sommer

but I don't have any idea about the last solution you suggested.

How can storing the data within the load process help?

marcus_sommer
MVP & Luminary

Re: export to excel is very slow

It just means building this table within the data model and then storing it:
store table into Export.csv (txt);
If most of the columns are dimensions it should not be too expensive.
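
For example, a minimal sketch of such a load script could look like the following; the table, field and folder names are only placeholders for your own data model:

ExportData:
LOAD
    CustomerID,   // placeholder dimension
    OrderDate,    // placeholder dimension
    Amount        // placeholder measure
    // ... plus the remaining fields of the straight table
FROM [lib://MyData/Orders.qvd] (qvd);

// write the table as a text file which Excel can open directly
STORE ExportData INTO [lib://MyExports/Export.csv] (txt);
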
Masi_Sahargahi
New Contributor III

Re: export to excel is very slow

But the end users want to export the Excel file whenever they want...

marcus_sommer
MVP & Luminary

Re: export to excel is very slow

IMO this is the wrong way to provide the users with the data, and that is independent of the tool used (I doubt that any tool would be able to calculate and deliver this amount of data from a data backend + frontend to Excel in a prompt manner).

Each tool needs to collect and combine all these dimensions, perform the calculation of the measures on them, render the results within a visualization, and then translate them into a different data format and send them over the network to the target. With around 270k records and 93 columns that is approximately 25M cell values, at least 250 MB of data, but probably more, rather around 1 GB. Many of the process steps will probably be single-threaded, so the whole task will need some time, and it's quite probable that it runs into a timeout and/or hits a limitation.

Besides this, I don't believe that the users really need all these columns and records at the same time. If you could leave out 30 of the columns, split the export into 2 - 3 parts and/or change the target from xls(x) to CSV, you might get it down to a size that works.

But I suggest considering carefully whether the previously mentioned store approach isn't the better one, especially since I doubt that the users do any direct analysis with the data; more likely they feed some kind of further processing with it. It might take some effort to change some logic to use the mentioned CSV as a source, but I think any other way will be more expensive in the end.
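
Combined with the store approach, such a split could then just be a couple of store statements against the table from the script above; again, the field and file names are only placeholders:

// store only the columns a particular consumer actually needs into its own file
STORE CustomerID, OrderDate, Amount FROM ExportData INTO [lib://MyExports/Export_Orders.csv] (txt);
STORE CustomerID, Region, SalesRep FROM ExportData INTO [lib://MyExports/Export_Sales.csv] (txt);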

- Marcus