SK28
Creator

Problems with data model and dashboard loading due to huge data volume

Hi,

 

I have a table with around 44 crore (i.e. 440 million) distinct records per month.

So we decided to create QVDs by month.

 

The data model performs many joins and calculations to produce columns that are very much required in the dashboard (for set analysis, list boxes, etc.), so we cannot ignore these columns; they have to be calculated in the data model itself.

I have optimised the QVD load (row-based optimisation).

It takes at least 2 hours to execute the data model just to load 3 months of data, and sometimes the system even hangs (you can see blacked-out screens).

After all the joins and filters, the required data comes to 8 crore (80 million) records.

The data model size goes up to 5 GB, which makes the dashboard load slowly and even get stuck.

I even tried to break the data model into two parts: a data-cleansing part, which holds the raw data and stores it in a single QVD, and a second part that loads from that QVD (80 million records) and builds the mandatory columns mentioned above.

Still I'm facing the same issue!

 

Is there a way to make this data model execute faster, or at least make the dashboard respond quickly (no hangs, freezes, etc.)?
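For reference, the monthly-QVD approach described above usually looks something like this in the load script. All file paths and field names here (Sales_YYYYMM.qvd, MonthKey, OrderID, etc.) are placeholder assumptions, not taken from the post. The key point is that a QVD load stays optimized only when the LOAD has no transformations and at most a single WHERE Exists() clause:

```
// Load each monthly QVD with an optimized load (no calculations here):
FOR EACH vMonth IN 202401, 202402, 202403

  Facts:
  LOAD MonthKey,
       OrderID,
       CustomerID,
       Amount
  FROM [lib://QVDs/Sales_$(vMonth).qvd] (qvd);   // optimized load

NEXT vMonth

// Do the heavy joins and derived columns once here, after loading,
// then store the finished model so the dashboard app only performs
// a plain optimized load of this single prepared QVD:
STORE Facts INTO [lib://QVDs/SalesModel.qvd] (qvd);
```

Moving the joins and derived columns into a separate generator app that runs the STORE, and letting the dashboard load only the finished QVD, keeps the slow work out of the dashboard reload.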

Thanks

1 Solution

Accepted Solutions
MindaugasBacius
Partner - Specialist III

There are several keys to success:

1. Aggregate the data. Use the correct granularity; the business might be perfectly happy with monthly or weekly data rather than daily or even hourly.

2. Save space by removing unnecessary fields, and use AutoNumberHash256 for keys, as long key values tend to eat a lot of space.

3. Split fields like timestamps, name and surname combinations, addresses, phone numbers, or other long strings/numbers into smaller parts.

 

There is no quick fix for this; optimizing the application is the key.
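A minimal load-script sketch of points 1 to 3; every table, field, and file name below is an illustrative assumption, not from the original post:

```
// 1. Aggregate to the granularity the business actually needs:
MonthlySales:
LOAD MonthKey,
     CustomerID,
     Sum(Amount)    AS MonthlyAmount,
     Count(OrderID) AS MonthlyOrders
FROM [lib://QVDs/SalesModel.qvd] (qvd)
GROUP BY MonthKey, CustomerID;

// 2. Replace long composite keys with compact sequential integers.
//    AutoNumberHash256() maps each distinct combination of its
//    arguments to a small integer, shrinking the key's symbol table.
Orders:
LOAD AutoNumberHash256(CustomerID, OrderDate) AS %OrderKey,
     Amount
FROM [lib://QVDs/Orders.qvd] (qvd);

// 3. Split a timestamp into separate date and time fields; two fields
//    with few distinct values use far less memory than one field with
//    millions of distinct timestamp values.
Events:
LOAD Date(Floor(EventTimestamp)) AS EventDate,
     Time(Frac(EventTimestamp))  AS EventTime
FROM [lib://QVDs/Events.qvd] (qvd);
```

Note that the transformed loads in 2 and 3 are not optimized QVD loads; the memory savings in the final model are usually worth paying that cost once in the generator script.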

