Not applicable

How to increase the temporal depth of the reports

hi all, we need a suggestion to optimize our system.

The goal is to increase the temporal depth of our reports without adding RAM or losing performance.

At the moment the data in our .QVD files are limited to 380 days, meaning we can only generate reports going back about a year.

We'd like to aggregate the data (e.g. generate some new .QVD files) in which we store the "transformed" data instead of the "raw" data.

To do this, we thought of creating a new ETL process that, instead of reading from the DB, reads the oldest data from the QVD, transforms it, stores it in a new "historical QVD", and finally deletes that old data from the source .QVD.

We know that this way we'll no longer be able to generate detailed reports from the archived data, but it should let us increase the depth of our reports.

We'd like to know whether, in your opinion, generating these "partial/aggregated .QVDs" will let us reach our goal.
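For what it's worth, the process described above could look roughly like the following QlikView load-script sketch. This is only an illustration: the file names (`Transactions.qvd`, `Historical.qvd`), the field names (`TransactionDate`, `CustomerID`, `Amount`), and the 380-day cutoff are all assumptions to be replaced with your own.

```
// Assumed cutoff: rows older than this get archived.
LET vCutoff = Num(Today() - 380);

// 1. Pull the oldest rows out of the detailed QVD (assumed name).
OldData:
LOAD *
FROM [Transactions.qvd] (qvd)
WHERE Num(TransactionDate) < $(vCutoff);

// 2. Aggregate ("transform") them, e.g. to one row per month and customer.
Historical:
LOAD MonthStart(TransactionDate) AS Month,
     CustomerID,
     Sum(Amount)                 AS Amount
RESIDENT OldData
GROUP BY MonthStart(TransactionDate), CustomerID;
DROP TABLE OldData;

// 3. Append any previously archived rows, then store the historical QVD.
IF Alt(FileSize('Historical.qvd'), 0) > 0 THEN
  CONCATENATE (Historical)
  LOAD * FROM [Historical.qvd] (qvd);
END IF
STORE Historical INTO [Historical.qvd] (qvd);
DROP TABLE Historical;

// 4. Rewrite the source QVD keeping only the recent rows.
Recent:
LOAD *
FROM [Transactions.qvd] (qvd)
WHERE Num(TransactionDate) >= $(vCutoff);
STORE Recent INTO [Transactions.qvd] (qvd);
DROP TABLE Recent;
```

Note that the `WHERE` clauses make these QVD loads non-optimized, so the archive step is best run in a scheduled off-hours reload.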

3 Replies
ramasaisaksoft

Hi Christian,

I'm sorry to say your question doesn't state exactly what you want.

I assume you are only asking for ideas.

As per my understanding:

Option 1: Archive the data as per your requirement. (You said less than one year of data, so I'd suggest keeping even less; if the data is huge, it may be better to use other tools like SAP, Cognos, etc.)

Option 2: At the database level, join fields (i.e. concatenate related fields such as First Name, Middle Name, and Last Name into a single Name field, so you won't get null values and it won't create any data issues).

Option 3: Try to drop unnecessary fields and tables at all levels, and also remove unused variables.
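Option 3 can be done in a couple of script lines; the field and table names below are just placeholders for whatever your data model actually contains:

```
// Drop columns no report uses; keeping them only inflates RAM.
DROP FIELDS SessionGUID, RawPayload FROM Transactions;

// Drop helper tables once the model no longer needs them.
DROP TABLE TempMapping;
```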

Sergey_Shuklin
Specialist

Hello, Christian!

There is one more way to optimize: create several reports that each show a different slice of the data and tie them together with links. Users won't notice the trick if the reports' visual parts are the same.

passionate
Specialist

Hi Christian,

There is a best practice for handling large data sets:

You can create multiple reports and then use Document Chaining.

This will improve performance while still letting you slice the data as required.

Regards,

Pankaj