Anonymous
Not applicable

Very Big Fact Table in Data Model

We are developing a data model in QlikView, following a star schema.

Our fact table holds around 45 million records, which results in a 725 MB QVD, and we are running into performance issues.

When we filter the data down to about 4 million records (a 250 MB QVD), things work much better for us.

Is there any way to deal with this scenario?

-- Shree Angane

5 Replies
datanibbler
Champion

Hi Shree,

Yes and no. Aside from IT-level measures, there is probably no way to solve your performance issue other than making your fact table smaller. That can basically only be done in two ways: by reducing the number of fields you load (you can find unused fields with the Document Analyzer, a QVW available here in the Community),

or by reducing the timeline (e.g. loading data for only the current year). A sketch of both ideas follows below.
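For illustration, a minimal load-script sketch combining both approaches; the table, field, and file names (Facts, OrderDate, Facts.qvd, etc.) are made up for the example:

    // Keep only the fields the application actually uses,
    // and restrict the timeline to the current year.
    Facts:
    LOAD
        OrderID,
        CustomerID,
        OrderDate,
        Amount
    FROM Facts.qvd (qvd)
    WHERE Year(OrderDate) = Year(Today());

Note that a WHERE clause like this makes the QVD load non-optimized, so the reload itself takes longer, but the resulting document is smaller and faster to use.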

HTH

Best regards,

DataNibbler

PrashantSangle

Hi,

I don't think 4 million records and a 250 MB QVD should cause any big issue by themselves.

Just check that you are not using expensive functions such as if() in your chart expressions.

Avoid if() where possible; set analysis is usually much faster (see the sketch below).
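For example, a row-by-row if() condition can often be replaced by set analysis, which filters the records up front instead of being evaluated for every row. The field names Region and Sales here are hypothetical:

    // Slow: if() is evaluated for every row of the fact table
    Sum(If(Region = 'EMEA', Sales))

    // Faster: set analysis restricts the record set once
    Sum({<Region = {'EMEA'}>} Sales)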

Regards,

Great dreamers' dreams are never fulfilled, they are always transcended.
Please appreciate our Qlik Community members by giving Kudos for sharing their time on your query. If your query is answered, please mark the topic as resolved 🙂
Anonymous
Not applicable
Author

Thanks for your reply.

I will certainly go through the Document Analyzer and figure out a way forward from there.

Best Regards,

Shree Angane

jaimeaguilar
Partner - Specialist II

Hi,

Instead of reducing rows, try reducing columns. Check which columns can be disregarded: besides the number of records, the number of columns (and the distinct values they hold) strongly affects the size and performance of your data model.

If you still experience performance issues after the usual optimizations (a 3- or 4-tier data architecture, optimized QVD loads, avoiding nested if()s in expressions, avoiding calculated dimensions and instead precalculating as much as possible in the script, etc.), then the next step would be to increase resources on your server, for example by adding more RAM. A sketch of an optimized QVD load follows below.
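As a sketch of what keeps a QVD load optimized: no transformations inside the LOAD, and at most a single WHERE Exists() clause on one field. The names (Facts.qvd, OrderYear) are assumptions for the example, and the QVD is assumed to already contain an OrderYear field:

    // Build the list of values to keep
    YearsToKeep:
    LOAD Year(Today()) as OrderYear
    AutoGenerate 1;

    // Optimized load: plain field list, a single Exists() filter,
    // and no calculations inside the LOAD statement
    Facts:
    LOAD *
    FROM Facts.qvd (qvd)
    WHERE Exists(OrderYear);

    DROP TABLE YearsToKeep;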

regards

pgrenier
Partner - Creator III

Hello,

There are a few things you might want to check in order to help you optimize your document:

  1. Eliminate unnecessary fields with unique values, such as unused keys or timestamps.
  2. If timestamps are absolutely necessary, split them into several fields to avoid large indexes on those fields. The more distinct values a field contains, the bigger the index, and hence the bigger your QlikView document. You may break a timestamp down into date, hours, minutes, seconds, and milliseconds; often enough, you will realize at this point that seconds and milliseconds are not actually needed (see the first sketch after this list).
  3. Add counter fields instead of using Count(DISTINCT ...). By adding a field with the value 1 to your table, you can use the Sum() function instead, which is a lot faster (see the second sketch after this list).
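A minimal script sketch of point 2; the OrderTimestamp field and the Facts.qvd file are assumptions for the example:

    // Split a full timestamp into a date part and a time part
    // rounded to the minute, so each field holds far fewer
    // distinct values than the raw timestamp did.
    Facts:
    LOAD
        Date(Floor(OrderTimestamp)) as OrderDate,
        Time(Round(Frac(OrderTimestamp), 1/1440), 'hh:mm') as OrderTime
        // , ... other fields ...
    FROM Facts.qvd (qvd);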
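And a sketch of point 3, assuming Orders.qvd has one row per OrderID (otherwise Sum() of the counter counts rows, not distinct values):

    // Add the counter once in the script ...
    Orders:
    LOAD
        OrderID,
        CustomerID,
        1 as OrderCounter
    FROM Orders.qvd (qvd);

    // ... then in chart expressions use
    //     Sum(OrderCounter)
    // instead of the much more expensive
    //     Count(DISTINCT OrderID)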

Regards,

Philippe