Not applicable

Limit dimension in table considering filters and variables

Hi,

In my application I have a table. I want the dimension of the table to take the current selection and some variable values into account. The formula for the dimension is as follows:

If( [Field1] > vAriable1 and [Field2] < vAriable2, [ID])

So this gives me all the [ID]s as the dimension for the table I want, taking the current selection and the variable values into account.

My problem is that the table might contain up to 5 million records, so if the current selection and variable values are not chosen wisely, the application keeps loading and loading and RAM overflows.

To avoid that, I tried to use the GetPossibleCount() function to limit the number of records to 100,000; however, the function only considers the current selection and not the variable values:

IF(GetPossibleCount([ID])>100000,'Too many datasets', If([Field1] > vAriable1 and [Field2] < vAriable2, ID))

Is there a way to limit the number of records for the table that takes both the current selection and the variable values into account?

I thought of something like this:

If( Count( If( [Field1] > vAriable1 and [Field2] < vAriable2, [ID])) > 100000, 'Too many datasets', [ID])

Thanks in advance

1 Solution

Accepted Solutions
shubham_singh
Partner - Creator II

Hi,

I created a sample app.

To solve your problem I created a variable (vTest) with this expression:

=count(aggr(only({<Sales={">=$(x)"},Revenue={">$(y)"}>}[Customer Name]),[Customer Name]))

Then use the expression below as the dimension:

=if($(vTest)<100,
    aggr(only({<Sales={">=$(x)"},Revenue={">$(y)"}>}[Customer Name]),[Customer Name])
)

This should solve your problem.
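
Mapped onto the field and variable names from the original question, the same pattern would look roughly like this (just a sketch; the variable name vCount and the 100000 threshold are placeholders, and vAriable1/vAriable2 are assumed to hold plain numeric values):

Variable vCount:

=Count(Aggr(Only({<[Field1]={">$(vAriable1)"},[Field2]={"<$(vAriable2)"}>}[ID]),[ID]))

Dimension:

=If($(vCount)<100000,
    Aggr(Only({<[Field1]={">$(vAriable1)"},[Field2]={"<$(vAriable2)"}>}[ID]),[ID])
)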

PFA qvf for reference.


17 Replies
ogster1974
Partner - Master II

Can you not put limits on the variable values?

Set default values and min/max values to avoid the overflow situation occurring.
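
For example (just a sketch; vFloor1 and vCeiling2 are hypothetical limit variables you would have to define), the thresholds could be clamped inside the expression so that an unwisely chosen value can never open up the whole data set:

If([Field1] > RangeMax($(vAriable1), $(vFloor1)) and [Field2] < RangeMin($(vAriable2), $(vCeiling2)), [ID])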

Anonymous
Not applicable
Author

After this condition If( [Field1] > vAriable1 and [Field2] < vAriable2, [ID]), try keeping the dimension limit at 100000.

shubham_singh
Partner - Creator II

That's an easy one.

Assuming you are using a straight table.

Open the expression of the ID dimension and replace it with:

aggr(only({<[Field1]={">$(vAriable1)"},[Field2]={"<$(vAriable2)"}>}ID),ID)

This will limit your IDs to the ones that meet your variable conditions.

Go to Add-ons, open the calculation condition expression, and write the following expression there:

GetPossibleCount([ID])<100000


Add a message saying, "Data Set too big, please filter to less than 100000"
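
If the calculation condition should also respect the variable thresholds (as noted above, GetPossibleCount only looks at the current selection), one option is to reuse the same set analysis there, for example (a sketch, assuming the variables hold plain numeric values):

Count({<[Field1]={">$(vAriable1)"},[Field2]={"<$(vAriable2)"}>} distinct [ID]) < 100000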


You could also use a dimension limit.

Not applicable
Author

Actually, the app's purpose is to let the user select the right data and then export it, so default values would necessarily lead to an overflow situation.

Not applicable
Author

Even if I set the dimension limit to 1000, the app stops working. I guess the dimension limit does not determine how many records are calculated, only how many are shown and can be exported.

Not applicable
Author

For a straight table your suggestion works and has already helped me, although the calculation condition expression has to be:

Count( If(  [Field1] > vAriable1 and [Field2] < vAriable2, [ID])) < 100000

I highly appreciate your support.

Unfortunately I am working with an export extension:

http://branch.qlik.com/#!/project/57882593dd42cf18d2312a05

and it does not offer any data handling / calculation condition setting. Moreover, using a dimension limit does not work either, as I wrote in my previous reply.

So what I need is one expression for the dimension that counts the number of records matching the filters and variable values and then, based on that number, decides whether to stop calculating and show an error message, or to show the records that meet the filters and variable values.

Something like this:

If( Count( If( [Field1] > vAriable1 and [Field2] < vAriable2, [ID])) > 100000,
    'Too many datasets',
    If( [Field1] > vAriable1 and [Field2] < vAriable2, [ID]))


(Last time I forgot the second If condition in my own suggestion; sorry for that.)

ogster1974
Partner - Master II

Then perhaps you need to rethink what you are using the app to achieve.

For example, could you create QVDs that contain subsets of data you know are manageable by user selection (e.g. filter on data in the 2016 QVD), or throw more RAM at the problem?

I don't think you will remove the bottleneck in a way acceptable to the user with this approach. I think you need to prepare the data in a way that supports the use case. The more complex your set analysis calculation, the harder the system has to work to calculate the result.
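
As an illustration of the per-subset QVD idea (purely a sketch; the library path, the Year field and the list of years are made up):

// hypothetical reload script: store one QVD per year so the app only ever loads a manageable subset
FOR Each vYear in 2015, 2016, 2017
    Facts_$(vYear):
    LOAD *
    FROM [lib://Data/Facts.qvd] (qvd)
    WHERE Year = $(vYear);
    STORE Facts_$(vYear) INTO [lib://Data/Facts_$(vYear).qvd] (qvd);
    DROP TABLE Facts_$(vYear);
NEXT vYear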

OmarBenSalem

Hi Thorben

I think, based on your scenario, you should take a look at this; I've already used it, and it's amazing!

Qlik Sense and the App On-Demand
