GabrielOtet
Contributor III

Reducing data processing time

Dear all,

 

I want to reduce the amount of time needed to process and display a CDF table, but so far I have not managed to (there is a large amount of data to be calculated). I have the feeling that the formula below could be optimised to return the results faster, but I don't know how.

Measures

RangeSum(Above(Count({<  here are the filters  >} Dimension ),0,RowNo(TOTAL)))
/
Count({<  here are the filters  >} TOTAL Dimension )

(attached screenshot: CDF_QLIK.PNG)

Has anyone had the same problem and solved it? Maybe the RangeSum should be replaced with something else, or is there another way of showing the graph?

I have the feeling that, because of the TOTAL parameter, the data is evaluated in its entirety, independently of the filters; but without it, the graph does not show the right values.

 

Best regards,

Gabriel

 

1 Solution

Accepted Solutions
GabrielOtet
Contributor III
Author

Hi Marcus,

 

Thank you for the answer.

I followed your advice of grouping the data and indeed, the situation improved drastically.

What I did (for others who may be searching for the same answer):

1. As there was a large number of points to be plotted, I looked for a way to group them without affecting the overall visual shape of the data.

2. The function used is Class(point, 0.01). I applied it to the dimension, keeping the measures untouched (see the sketch below).

There is a barely noticeable visual difference, which can be spotted only when zooming in.
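As a rough sketch of the resulting chart setup (the field name point and the filter placeholder are taken from the posts above, not from the actual app):

Calculated dimension:

Class(point, 0.01)

Measure (unchanged):

RangeSum(Above(Count({<  here are the filters  >} Dimension ),0,RowNo(TOTAL)))
/
Count({<  here are the filters  >} TOTAL Dimension )

With a 0.01-wide bucket the accumulated expression is evaluated once per bucket instead of once per raw value, which is what cuts the calculation time.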


2 Replies
marcus_sommer

The essential part of your logic is an inter-record function, which is in general quite resource-consuming because it doesn't calculate a single value per dimension value but n of them. With small dimensions like 12 months it's often not really noticeable, but if there are many thousands of values the number of calculations and the needed RAM increase quite heavily.

This means a reduction of the available dimension values would be useful to reduce the processing time. For this you could create a second dimension field with some kind of grouping of your data. In your case it might be some rounding, maybe to integers instead of float numbers. Quite often such adjustments are not visible within the chart, respectively they don't really change the informational value of the view.
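A minimal script sketch of such a grouping (the table name, field names and inline sample data are only for illustration, not from the original app) - a preceding load adds the grouped field without touching the existing load:

Facts:
LOAD
    *,
    Round(Dimension, 0.01) as Dimension_Grouped;   // or Floor() / Class() for a different kind of bucketing
LOAD * INLINE [
Dimension
0.1234
0.1267
0.9981
];

Using Dimension_Grouped as the chart dimension leaves far fewer distinct values for the inter-record accumulation to loop over.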

Besides this, you might save some resources if you put Count({<  here are the filters  >} TOTAL Dimension ) into a variable and use the variable result as the divisor.
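One way this could look (the variable name vTotalCount is just an illustration): a variable whose definition starts with = is evaluated once per selection state, so the divisor is not recalculated for every chart row.

vTotalCount (defined in the variable editor):

=Count({<  here are the filters  >} TOTAL Dimension )

Measure:

RangeSum(Above(Count({<  here are the filters  >} Dimension ),0,RowNo(TOTAL)))
/
$(vTotalCount)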

- Marcus
