Not applicable

10 million rows limit?

Hello guys (and gals),

I have a curious problem. There are 2 tables that are pulled into QlikView (ver 8.2), one having about 6 million rows and the other about 4 million rows.

If I bring in either of the two by itself, there is no problem and the reload completes. I have ensured that there are no joins between the two and expected that without joins the reload would succeed. However, it doesn't. I'm running this on a server with 3 GB of RAM and a 4 GB pagefile.

I have 2 questions.

a) Is there a limit to how much data QlikView can take in based on RAM and pagefile? (Because if I do a limited load of, say, 5 million lines, it works and both tables reload.)

b) How can I get around this without having to delete data?

thanks

5 Replies
prieper
Master II

I have approximately the same behaviour on a standard PC/notebook. I would not consider 3 GB of RAM sufficient for a server; the only solution would be to invest in more RAM or to reduce the data.

Peter

Not applicable
Author

Thanks. Seems that will have to be the way, although I was hoping for a "software" solution.

Cheers.

Anonymous
Not applicable
Author

It's not a QV limit - it's a Windows limit: 2 GB per process. I'm not sure that adding RAM by itself will make any difference. A more reliable fix is to use 64-bit Windows.

johnw
Champion III

Right. If you're on 64-bit Windows and 64-bit QlikView, adding RAM is the "right" solution. If not, adding RAM won't help. Either way, you may benefit from tuning the memory used by the QlikView application itself. Do you need ALL of the fields you are loading? Do you have long key values that could be autonumbered? Are you joining data in any particularly inefficient way? One thing to look at is the .mem file for your application (Document Properties > General > Memory Statistics). Then read the .mem file in as a data source to a new application. It will tell you where all the memory is being used.
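For illustration only, a rough load-script sketch of both ideas follows: using AutoNumber() to shrink a long composite key, and loading a saved .mem file back in as data. The file and field names (Transactions.qvd, OrderID, LineNo, MyApp.mem) are placeholders, and the assumption that the .mem export is tab-delimited with a header row should be checked against your own file before relying on it.

// Replace a long composite key with a compact integer surrogate.
// OrderID and LineNo are placeholder field names for illustration.
Facts:
LOAD
    AutoNumber(OrderID & '|' & LineNo) AS %FactKey,
    Amount
FROM Transactions.qvd (qvd);

// Read the exported memory statistics back in as a table.
// Assumption: the .mem file is tab-delimited with embedded labels;
// inspect it in a text editor first, as the layout can vary by version.
MemStats:
LOAD *
FROM MyApp.mem (txt, embedded labels, delimiter is '\t');

Once loaded, a simple chart summing the Bytes-type column by object gives a quick view of where the memory goes.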

Not applicable
Author

Thanks guys. I've actually tried to optimize and summarize the incoming data; it's just that there are very many transactions and we need the details in each row. While I wait for RAM I've done a couple of things. First, I purged some of the data, and secondly, I split the reporting into two distinct reports so that we never open both tables at once. It's not the ideal solution, but it works for now.

Thanks again for the replies, I appreciate it.