Contributor II

Exporting table to CSV does not export all data

 Hello Qlik Community!


I have a visualization extension that exports a selected master visualization via a button click. The user is also able to choose the format they would like the export in (OOXML or CSV).



When exporting a table with around 230 columns and 280,000 rows to CSV, the exported file only has around 22,000 rows. However, if I export the table to OOXML, it exports all columns and rows.


What I've tried:

  • Exporting via the exportData method (get the object/master vis, then model.exportData)
  • Getting the hypercube and then exporting the data via this extension (returns an error since the hypercube is too large due to the number of rows in the table)


Is there another way I can try exporting? Or any clue on what may be causing this? Any help in tackling this issue would be appreciated.

6 Replies


1. Is this a 3rd-party extension?
2. Usually Excel and CSV exports work without issue.
3. Can you create a separate app with just the data you want to export?

@BrionBaskerville , client-managed or Qlik Cloud? The CSV call to exportData does have record limitations on it (see the documentation), albeit I think the limit is greater than your example.

What's the problem you're trying to solve? 64 million cells of data seems pretty large. 




Contributor II


  1. Yes, I created it.
  2. Yes, we haven't had any issues before. This came up recently and has only been noticed with large amounts of data.
  3. I'll try this, thanks!



Client-managed, Qlik Sense Enterprise for Windows.

Yes, I did see this earlier and thought I was well under the limit, like you mentioned, so I figured that shouldn't be an issue.

Indeed, it is quite large, and I don't have a good use case for it other than that we have the data in Qlik and were requested to export it via CSV.

MVP & Luminary

Just some ideas ... besides the record limitation, there might also be a size limitation and/or a timeout setting. If so, decreasing the number of columns should lead to more exported records.

Quite old-school, but it might also help to check the data at the point where the export breaks for anything unusual or strange. If it's possible to configure the file format, you could add a "no eof", add/remove/adjust the msq settings, or play with similar settings.
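Such a break-point check could be sketched as below; this is only an illustration, and the file name in the usage comment is a placeholder, assuming the truncated CSV is available on disk:

```javascript
// Sketch: inspect the last rows of a truncated CSV export for anything
// unusual (embedded delimiters, control characters, very long cells).
function tailRows(csvText, n) {
  // Split on newlines and drop empty entries (e.g. from a final newline).
  const rows = csvText.split(/\r?\n/).filter((r) => r.length > 0);
  return rows.slice(-n);
}

// Usage (the file name is hypothetical):
// const fs = require('fs');
// const text = fs.readFileSync('export.csv', 'utf8');
// tailRows(text, 5).forEach((r) => console.log(r.length, JSON.stringify(r)));
```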

- Marcus

Contributor II

@marcus_sommer Thank you for the suggestions! Will give them a try. I also thought about there being a size limitation, but when trying to export another table, I was able to get a larger file (perhaps there were fewer columns than in the initial table; I'll have to look into it). A timeout setting seems plausible as well.


Just to confirm that what I'm doing is correct before I try anything else: as I understand it, the CSV export needs the path to the qHyperCubeDef to work correctly, is that right? If I have the ID of the master item/object/visualization I would like to export, how would one get the path to its qHyperCubeDef via the API? I tried supplying the qHyperCube from getObjectProperties, but it did not work.



let btn = {
    item: 'RJRmpJ',
    format: 'CSV_C',
    filename: '',
    download: true
};

app.getObjectProperties( btn.item ).then(function(model) {
    // What should the second (path) argument be here?
    model.exportData( btn.format, /* path to qHyperCubeDef? */ '', btn.filename );
});
I've also tried it with just '/qHyperCubeDef', which actually exports, but of course not the whole table (if the table is large enough). Which makes me think perhaps it's only exporting the data that's been loaded and not the rest that might not have been (when you scroll, for example)?
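If exportData keeps truncating, one fallback would be to page through the hypercube manually and assemble the CSV client-side. The sketch below is generic and its names are assumptions: `fetchPage(top, height)` stands in for a `model.getHyperCubeData('/qHyperCubeDef', [{qTop: top, qLeft: 0, qWidth: nCols, qHeight: height}])` call that resolves to the page's `qMatrix`; the 10,000-cell figure is the engine's cap on a single data page.

```javascript
// Sketch: page through a hypercube and build CSV rows client-side.
// fetchPage(top, height) is a stand-in for model.getHyperCubeData(...)
// resolving to the page's qMatrix (rows of cells with a qText property).
async function exportAllRows(fetchPage, totalRows, nCols) {
  // A single data page is capped at 10,000 cells, so size qHeight to fit.
  const pageHeight = Math.max(1, Math.floor(10000 / nCols));
  const lines = [];
  for (let top = 0; top < totalRows; top += pageHeight) {
    const height = Math.min(pageHeight, totalRows - top);
    const matrix = await fetchPage(top, height);
    for (const row of matrix) {
      // Real CSV output would also need quoting/escaping of delimiters here.
      lines.push(row.map((c) => c.qText || '').join(','));
    }
  }
  return lines.join('\n');
}
```

With around 230 columns this gives pages of 43 rows, i.e. roughly 6,500 round trips for 280,000 rows, so it is slow but avoids the export routine entirely.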
MVP & Luminary

I have no first-hand experience with exporting content from Sense, so I couldn't say whether there are further possibilities within the export routine.

If you could really export larger data sets, it shouldn't be size-related. That said, the pure number of rows/columns might not be meaningful enough, because the content of the cells could be quite different: numbers vs. (larger) strings. If you don't have useful test data, you could create some in a separate app with combined rand() statements within an autogenerate load.
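A minimal autogenerate load along those lines might look like the following; the table name, column mix, and row count are placeholders to be sized to match the real table:

```
// Hypothetical Qlik load script generating wide random test data:
TestData:
LOAD
    RecNo() as Id,
    Rand() as Num1,
    Repeat(Chr(65 + Floor(Rand() * 26)), 20) as Str1
AUTOGENERATE 280000;
```

Varying the string lengths lets you separate a pure row-count limit from a byte-size limit.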

The testing method above should confirm or exclude any size-related matter. Another check could go to any timeouts: if the export breaks after roughly the same amount of time on each attempt, that would be a strong hint of a timeout. This may be related to the available hardware and network/storage resources, but also to the data model and the object within it, since the calculation time for the object is also relevant. If it's timeout-related, you could look at whether the timeouts can be adjusted in any way and/or whether the data model and object can be optimized.
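One way to test the timeout hypothesis is to time each export attempt and compare: failures clustering around one duration point to a timeout. A minimal sketch, where the wrapped function is a stand-in for whatever triggers the export:

```javascript
// Sketch: measure how long an async export attempt takes, whether it
// succeeds or fails. Compare the ms values across several failing runs.
async function timeIt(fn) {
  const start = Date.now();
  try {
    const result = await fn();
    return { ok: true, ms: Date.now() - start, result };
  } catch (error) {
    // A failure at a near-constant elapsed time suggests a timeout.
    return { ok: false, ms: Date.now() - start, error };
  }
}
```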

Before checking this, you could simply create a new app with the same data and a new export object and check whether the export works there; if it does, that hints at corruption within the old app.

If none of this can be proven, an incompatibility between the data and the export logic becomes more likely, and it might be helpful to identify the values at which the export breaks.

- Marcus