
QlikView App Dev

Discussion Board for collaboration related to QlikView App Development.

Contributor II

Very Big Fact Table in Data Model

We are developing a data model in QlikView, using a star schema.

We have around 45 million records in the fact table, which results in a 725 MB QVD; as a result we are facing performance issues.

When we filter the data down to 4 million records, the QVD size drops to 250 MB and things work better for us.

Is there any way to deal with this scenario?

-- Shree Angane

5 Replies

Hi Shree,

Yes and no - there is probably no way to solve your performance issue (IT-specific actions aside) other than making your fact table smaller. That can basically only be done by either reducing the number of fields you are loading (find out about unused fields using DocumentAnalyzer, a QVW available here in the Community)

or by reducing the timeline (e.g. loading data for only the current year).
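As a sketch, both ideas could look like this in the load script. All field, table and file names here are made up for illustration:

```
// Hypothetical example: load only the fields that are actually used,
// restricted to the current year.
Fact:
LOAD
    OrderID,
    CustomerKey,
    ProductKey,
    OrderDate,
    SalesAmount
FROM [Fact.qvd] (qvd)
WHERE Year(OrderDate) = Year(Today());
```

Note that a Where clause like this makes the QVD load non-optimized; if reload time matters, a Where Exists() on a single field is the one filter that keeps the load optimized.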


Best regards,




I don't think 4 million records and a 250 MB QVD should create any big issue.

Just check that you are not using performance-affecting functions like If() in your expressions.

Avoid If() where possible.
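For example, a chart expression that runs If() over every row can usually be rewritten with set analysis, which restricts the records before aggregating. Field names here are hypothetical:

```
// Slow: If() is evaluated for every row of the fact table
Sum(If(Year = 2023, SalesAmount))

// Faster: set analysis filters the record set up front
Sum({< Year = {2023} >} SalesAmount)
```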


Great dreamer's dreams never fulfilled, they are always transcended.
Contributor II
Contributor II

Thanks for your Reply.

Surely I will go through DocumentAnalyzer; I need to figure out how to use it.

Best Regards,

Shree Angane

Partner - Specialist II
Partner - Specialist II


Instead of reducing rows, try reducing columns. Check which columns can be disregarded, because besides the number of records, another thing that affects the size and performance of your data is the columns.

Also, if you're still experiencing performance issues after doing some optimizations (working in a 3-4 tier data architecture, using optimized QVD loads, avoiding nested Ifs in expressions, avoiding calculated dimensions and instead precalculating as much as possible in the script, etc.), then the next solution would be to increase resources on your server (adding more RAM, for example).
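On the optimized-QVD point: a QVD load stays optimized only when the script applies no transformations to the fields. A rough sketch, with illustrative names:

```
// Optimized: fields are read straight from the QVD, no expressions applied
Orders:
LOAD OrderID, CustomerKey, SalesAmount
FROM [Orders.qvd] (qvd);

// Still optimized: Where Exists() on a single, already-loaded field
// is the only filter that preserves the optimized load
LOAD OrderID, CustomerKey, SalesAmount
FROM [Orders.qvd] (qvd)
WHERE Exists(CustomerKey);
```

Any calculation in the LOAD (e.g. `Upper(CustomerKey)`) or any other Where condition drops you back to a standard, much slower load.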


Partner - Creator III
Partner - Creator III


There are a few things you might want to check in order to optimize your document:

  1. Eliminate unnecessary fields with unique values such as unused keys or time-stamps
  2. If time-stamps are absolutely necessary, split them into several fields in order to avoid large symbol tables on those fields. The more distinct values a field has, the bigger its symbol table, and hence the bigger your QlikView document. You may opt to break them down into date, hour, minute, second and millisecond. Often enough, you will realize at this point that seconds and milliseconds are not actually needed.
  3. Add counter fields instead of using Count(DISTINCT ...). By adding a field with the value 1 to your table, you can use the Sum() function instead, which is a lot faster.
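Points 2 and 3 could be sketched in the script like this (all names are hypothetical; note that Sum of a counter only matches Count(DISTINCT OrderID) when OrderID is unique per row):

```
// Split the timestamp into low-cardinality parts and add a counter field
Fact:
LOAD
    Date(Floor(EventTimestamp)) AS EventDate,
    Hour(EventTimestamp)        AS EventHour,
    Minute(EventTimestamp)      AS EventMinute,
    1                           AS OrderCounter,
    OrderID,
    SalesAmount
FROM [Fact.qvd] (qvd);

// In the chart, use:  Sum(OrderCounter)
// instead of:         Count(DISTINCT OrderID)
```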