Not applicable

Loading almost 500 million records into a QlikView dashboard table gives the message "Object out of memory"

Hi All,

I am doing a load test and I have 128 GB of RAM.

I was trying to load almost 500 million records into a QlikView dashboard report and got the message "Object out of memory", but I could see that RAM usage was only at about 50%.

Has anyone had a similar issue? Could you please explain it?

Another thing I noticed: when browsing a saved QlikView report, it takes a long time to load the fact data into a table in the report.

8 Replies
MK_QSL
MVP

Maybe you need more RAM or a better system.

I don't have a definitive answer, but you can check the enclosed document for this information.

its_anandrjs

Use QVD files for loading data into QlikView; they are much faster than any other raw source. Try to create QVDs first and then load the data into the dashboard from them. Also, more RAM is recommended for loading huge volumes of data.
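For example (a rough sketch only; the file, table, and field names here are made up, not from the original post):

// Extract app: read from the raw source once and store to QVD
Fact:
LOAD OrderID, CustomerKey, ProductKey, Amount
FROM [..\Source\Fact.csv] (txt, utf8, embedded labels, delimiter is ',');

STORE Fact INTO [..\QVD\Fact.qvd] (qvd);
DROP TABLE Fact;

// Dashboard app: a plain LOAD with no transformations is an "optimized"
// QVD load and is much faster than reading the raw source again
Fact:
LOAD OrderID, CustomerKey, ProductKey, Amount
FROM [..\QVD\Fact.qvd] (qvd);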

ThornOfCrowns
Specialist II

Have a read of Peter Crossley's reply in this thread:

http://community.qlik.com/thread/1609

It has useful info for checking whether it's your data setup, rather than the number of records, that's causing the issue.

simondachstr
Luminary Alumni

Are you trying to display fields from multiple tables? That might cause a problem if those tables are not properly linked.

ashfaq_haseeb
Champion III

Hi,

Please look at the attachment.

Hope it helps.

Regards

ASHFAQ

its_anandrjs

Yes, James is right. If you use complex IF statements in the chart, that can cause the problem you are seeing now: while the IF calculations are running, the chart can show "Object out of memory". Also, if the data in the chart comes from different tables and your data model does not follow the rules, I suggest you check the data model, especially the link keys between the tables, and check the information density between them.
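For illustration (the field names here are made up, not taken from your app): a row-by-row condition in a chart expression such as

Sum(If(Year = 2014 and Region = 'EU', SalesAmount))

can usually be rewritten with set analysis, which QlikView evaluates much more efficiently over large data sets:

Sum({<Year = {2014}, Region = {'EU'}>} SalesAmount)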

antoniotiman
Master III

Hi,

try increasing the cache Limit % (the default is 10%).

Regards

Not applicable
Author

Hi All,

To make my question more specific, I am adding the points below:

  1. I loaded the data for all dimensions (7 dimension tables in total) into separate QVD files.
  2. The fact data (1 fact table) went into another QVD file; the fact table contains only the key columns of the dimension tables and 4 measure columns.
  3. My dashboard report contains 4-5 columns from the dimension tables and a chart that shows a count by a single attribute.
  4. My dashboard report also contains another Report Table (not a pivot table) with 4-5 columns from the dimensions and 4 measure columns from the fact table, and this data is at a granular level. There is no expression (IF/ELSE) in the dashboard.
  5. All the dimensions are correctly joined to the fact table [Dimension (1)-(M) Fact]; a simplified load-script sketch follows after this list.
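The model is loaded roughly like this (a sketch only; the table and field names below are illustrative, not the real ones):

DimCustomer:
LOAD CustomerKey, CustomerName, Country
FROM [..\QVD\DimCustomer.qvd] (qvd);

// ...the other six dimension QVDs are loaded the same way...

Fact:
LOAD CustomerKey,
     ProductKey,
     Measure1,
     Measure2,
     Measure3,
     Measure4
FROM [..\QVD\Fact.qvd] (qvd);

// QlikView associates the fact table with each dimension automatically
// on the identically named key fields, giving the (1)-(M) links above.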

I am facing the problem only while loading the data into the Report Table (out of memory when the whole 500 million records are loaded). For smaller data sets of 100 or 200 million records, each refresh or selection takes 4-5 minutes to load the data.

I was monitoring through Task Manager and found that CPU usage (2 CPUs with 6 cores each, Intel® Xeon® Processor E5-2630 v2 (15M Cache, 2.60 GHz)) goes to 100% for 20-30 seconds and then drops to only 4-5%, and RAM usage never went above 50%.

Note: This is a new physical server built only for QlikView load testing, and nothing other than QlikView is running on it.

ARK | Intel® Xeon® Processor E5-2630 v2 (15M Cache, 2.60 GHz)