Hey Guys,
We have a very large data set (more than 200 GB) that also needs to be updated incrementally after the initial bulk load. I have loaded a small portion and am already running into performance issues. (I have tried both creating QVDs and direct DB loads, and I have already reduced the number of columns being extracted.) Can someone please suggest how to implement this? The stakeholders also want to extract 3-4 years of data from the dashboards. I am fairly new to QlikView and have not handled this much data in Qlik dashboards before. Any suggestions on what process I can try?
Are there options to archive the data and only load it into the dashboard when someone is trying to extract a specific data set? But even at a minimum, the data set for a single year would still be very large (20 million rows a month). Thanks.
Thanks Marcus. This does make sense. Currently I am using a somewhat smaller data set for testing (30-40 GB). I am loading the QVDs incrementally, but when those QVDs are loaded into the dashboard they load the whole data set every time, which is why it takes so long.
How long does the load currently take, and how long do you expect it should take?
- Marcus
We refresh the tables in the DB every 15 minutes, so we were planning to do the same on the dashboard, but loading 200 GB of data every time takes ages. If we could do that incrementally, it might become efficient.
At the bottom of the link above are further links to more advanced topics on creating data models - there you will find various explanations and examples of incremental loading, and of keeping those loads optimized with exists().
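As a minimal sketch of that exists() technique (PrimaryKey, DBSOURCE, history.qvd and vLastExecTime are placeholder names, not from any particular example):

// Fetch only rows changed since the last reload (vLastExecTime
// is assumed to hold that timestamp, e.g. saved from a prior run)
NewRows:
Load *;
SQL SELECT * FROM DBSOURCE
WHERE LastModified >= $(vLastExecTime);

// A single "where not exists()" condition keeps this QVD read
// optimized: only rows whose key was NOT just reloaded from the
// database are appended, so updated rows are not duplicated.
Concatenate (NewRows)
Load *
From [history.qvd] (qvd)
Where Not Exists(PrimaryKey);

Store NewRows Into [history.qvd] (qvd);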
- Marcus
That was very helpful. I'll read those in the next few days and give it a try!
Hi,
Try using an incremental load. The first full load of all the data takes time, but after that only the updated data is loaded.
See the following; it may help you.
// Path to the QVD store (adjust to your environment)
Let v_QVDPath = 'your qvd file path';
Set vQVD_File = '$(v_QVDPath)/filename.qvd';

// -1 (true) if the QVD already exists, 0 (false) otherwise
Let vQVD_File_Exists = If(FileSize('$(vQVD_File)') > 0, -1, 0);

If $(vQVD_File_Exists) Then
    // Find the latest date already stored in the QVD
    MaxDate:
    Load Max(DateField) as MaxDate
    From [$(vQVD_File)] (qvd);

    Let vIncExpression = Chr(39) & Peek('MaxDate') & Chr(39);
    Drop Table MaxDate;

    // Pull only new and changed rows from the database ...
    TableName:
    Load *;
    SQL SELECT * FROM DBSOURCE
    WHERE DateField >= $(vIncExpression);

    // ... and append the unchanged history from the QVD.
    // Rows with DateField equal to the max date are excluded here
    // because they were just re-read from the database above.
    Concatenate (TableName)
    Load *
    From [$(vQVD_File)] (qvd)
    Where DateField <> $(vIncExpression);
Else
    // First run: full load
    TableName:
    Load *;
    SQL SELECT * FROM DBSOURCE;
EndIf

// Write the refreshed data back to the QVD and free the memory
Store TableName Into [$(vQVD_File)] (qvd);
Drop Table TableName;
Thanks,
Nitin.
Hi Nitin,
Thanks for your reply. Is the code you provided for an incremental load of the QVD? Thanks.
Yes,
It is an incremental load
Thanks,
Nitin.
Hey Nitin,
I have already implemented the incremental QVD load, but I wanted to do an incremental load into the dashboard from the QVD, rather than loading the whole QVD every time. Thanks.
That may not be possible in the dashboard; you need to load all the data there.