Not applicable

Limit rows in table export to csv

Hi,

Is there any limit on the number of rows that can be exported from QlikView QVW files?

A little background on my question:

I'm trying to export a table box with ~850,000,000 rows into a CSV file in QlikView 11 Desktop (right click -> Export -> Save as -> CSV). After 10 minutes, the export fails, and the table box in QlikView is displayed with a cross symbol indicating the calculation has failed.

I know I can solve the issue in various other ways. I'm just curious to know the limitation of the tool.

7 Replies
Not applicable
Author

Hi,

The maximum Excel worksheet size is 1,048,576 rows by 16,384 columns.

Not applicable
Author

Raja Kumar,

Why are you referring to worksheet size?

It's a CSV file, which can be opened in a text editor. I'm interested to know QlikView's export limit.

Not applicable
Author

This isn't a direct reply to your question but it is worth noting that Table Boxes are not very efficient when it comes to memory and CPU. They are one of the top causes of badly performing QlikView apps. 850m rows is a significant amount to be storing in a tablebox. I think your export issue is related to that. Throw enough RAM and CPU at the task and it might work, but is it really worth upgrading your hardware for something that QlikView isn't really designed for?

If you need to export 850m rows into a delimited text file (CSV), then I would recommend doing it from the script instead, using something like:

store Transactions into transactions.txt (txt);

I tested it with 2 million rows and it took mere seconds.
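
For reference, a minimal sketch of that script-based export, assuming the data sits in a QVD named TransactionData.qvd and the table is called Transactions (both names are just for illustration, not from the original post):

// Illustrative only: TransactionData.qvd and Transactions are assumed names.
// Load the big table in the script rather than displaying it in a table box.
Transactions:
LOAD *
FROM TransactionData.qvd (qvd);

// Write the whole table to a delimited text file on disk; no front-end
// object is involved, so the table box calculation failure is avoided.
STORE Transactions INTO transactions.csv (txt);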

Not applicable
Author

I ran into this exact problem yesterday. What I noticed is that on my 16GB machine I am able to export only about 350K rows.

Two options I can suggest:

1) Export the data by selecting values in some field, which breaks the table box export into three or four smaller sets (a script-based variant is sketched after this list).

2) If everything comes from the same table in the QVW, then store it from the script as mentioned by Kai Hilton-Jones.
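
If it helps, here is a hedged sketch that combines both suggestions: loop over the values of a field in the script and write one delimited file per value, so no single export gets too large. The table name Transactions and the field Year are assumptions, and the table is assumed to be already loaded in the script:

// Illustrative only: Transactions is assumed to be resident in memory and
// Year is an assumed field used to split the export into smaller files.
FOR Each vYear in FieldValueList('Year')

    ChunkExport:
    NoConcatenate
    LOAD *
    RESIDENT Transactions
    WHERE Year = $(vYear);

    // One file per Year value, e.g. transactions_2013.csv
    STORE ChunkExport INTO transactions_$(vYear).csv (txt);
    DROP TABLE ChunkExport;

NEXT vYear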

Anonymous
Not applicable
Author

I agree that it only makes sense in the script.

(I cannot understand the need to see a table with tens of thousands of rows on the front end, not to mention hundreds of millions.)

Not applicable
Author

Very true. As Michael states, how can a human being realistically digest a table containing hundreds of millions of rows? That's exactly the reason why companies purchase BI platforms such as QlikView.

If you need to display transactional data then use one of these methods:

  • Conditional show and pre-filtering: only display data in a table box when the number of rows is less than, say, 1,000; this forces a user to make some sensible selections before being presented with targeted raw data (see the sketch after this list)
  • Document chaining and pre-filtering: link from one document to another, e.g. a high-level view with a link to a second dashboard showing transactional data. Only enable the link if sufficient selections have been made by the user. This method uses RAM efficiently compared to dumping everything into a single QVW
  • Direct Discovery (QlikView 11.2): ad hoc queries to pull down transactional information on demand. DD 2.0 has just been released as part of 11.2 SR5
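
To illustrate the first bullet, the table box can be given a show condition (the Conditional option on the object's Layout tab) so it only renders once selections have narrowed the data down. A minimal sketch, assuming a key field called TransactionID and an arbitrary threshold of 1,000 rows:

// Show condition: hide the table box until the current selections
// leave fewer than 1,000 rows. Field name and threshold are assumptions.
=Count(DISTINCT TransactionID) < 1000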
Not applicable
Author

I really liked your answer. I'm definitely new to this BI tool, and I'm facing the exact same problems in my project, with millions of rows and limited RAM.