Is there a limit to the Length of a given variable in QlikView 8.5?
Long story short, I'm using QlikView 8.5 for a client to create a simulated P&L statement.
I'm using a Pick-Match setup, and populating the MATCH part with 40+ different choices. To make the code reusable, instead of this:
PICK(MATCH([Metric ID], 1, 2, 3...39, 40), SUM(1), MAX(2), AVG(3)...SUM(39), SUM(40))
I'm doing this:
PICK(MATCH([Metric ID], $(MetricIDList)), $(MetricSyntaxList))
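For reference, the two variables could be defined in the load script along these lines (a minimal sketch; the IDs and the field names Sales, Margin, Price are made up for illustration):

```
// Hypothetical definitions – adjust the IDs and expressions to your model
SET MetricIDList = 1, 2, 3;
SET MetricSyntaxList = SUM(Sales), MAX(Margin), AVG(Price);
```

With definitions like these, the dollar-sign expansions above reproduce the long-hand PICK(MATCH()) expression.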
The problem is, my $(MetricSyntaxList) variable is now 5,200+ characters, and no longer works. I know my syntax is correct, because it works if it's under 5,200 characters and breaks if it's over 5,200 characters.
Thanks in advance.
I don't think there is supposed to be a limit. That said, I believe I've run into expression length limits before when specifically trying to test that. When long enough, I've had valid expressions crash out the entire application.
Unfortunately, I don't know of a good workaround for what you seem to be trying to do.
Maybe if you need this in a chart type that allows hidden expressions, you could check for a limited number of metrics in several different pick(match()) expressions. Then create a final expression that picks the first non-null value from the previous results. That's assuming the limit is related to one expression or variable, and not to the total across several. Since the limit is likely a bug, it's very hard to guess what might and might not work.
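A sketch of that idea, assuming two hidden expressions labelled Part1 and Part2 (the labels and the split variables are hypothetical):

```
// Hidden expression labelled Part1 – covers the first half of the metrics
PICK(MATCH([Metric ID], $(MetricIDList1)), $(MetricSyntaxList1))

// Hidden expression labelled Part2 – covers the second half
PICK(MATCH([Metric ID], $(MetricIDList2)), $(MetricSyntaxList2))

// Visible expression – picks the first non-null result by label
IF(NOT ISNULL([Part1]), [Part1], [Part2])
```

Referencing the other expressions by their label in square brackets keeps each individual expression well below the problematic length.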
Hi Corey, I had the same problem at a big customer with big, complex expressions in version 8.5. Unfortunately, we could not find a good way to solve it other than migrating to version 9. I guess you have a good chance using hidden expressions as John pointed out.
We are having the same problem with QlikView 10 SR3.
We have some very long formulas in a dozen variables. One table incorporates all the calculations, so the table expression has a number of IFs that decide which variable to display.
At a certain point the table stops evaluating the variables.
We consider this a bug, since there is no explicit indication from Qliktech about a variable length limit.
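The dispatching expression described above might look roughly like this (the field [Row Type] and the variable names are placeholders, not the actual application's):

```
// Table expression choosing which variable to evaluate per row
IF([Row Type] = 1, $(vFormula1),
 IF([Row Type] = 2, $(vFormula2),
  $(vFormulaDefault)))
```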
What is autocommenting expressions?
I need it because I have many calculations to put into variables with PICK(MATCH()).
I'm trying to build time and metric dimensions that the user can choose from.
This is a rather old topic, but I suspect you can autocomment expressions by adding variables.
First create a variable, name it vCommentEnabled, and set its definition to:
Then add a text object to the sheet with expression:
='test 1 ' & $(=$(vCommentEnabled(0))) 'test2'
And add another text object to the sheet with expression:
='test 1 ' & $(=$(vCommentEnabled(1))) 'test2'
You'll notice that, based on the parameter 1/0, the latter part of the expression is automatically commented out. That is, the 'test2' bit of the expression can be dynamically commented depending on the output of the variable vCommentEnabled, acting as a user-defined function.
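The variable definition itself is missing from the post above. One definition that behaves as described (this exact formulation is my assumption, not necessarily the original author's):

```
// vCommentEnabled – assumed definition.
// With parameter 1 it returns the text  '' //  (an empty string literal
// followed by a line comment), which closes the concatenation and
// comments out the rest of the line; with parameter 0 it returns
// nothing, so the expression evaluates unchanged.
if($1 = 1, chr(39) & chr(39) & ' //')
```

The inner $(vCommentEnabled(1)) substitutes the parameter into the definition, and the outer $(=...) evaluates the if(), so with 1 the expression becomes ='test 1 ' & '' // 'test2' (yielding just 'test 1 '), while with 0 it becomes ='test 1 ' & 'test2'.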
I have a similar question. Imagine that I have a SQL query that loads a big list of Item# from Table1.
In the next part of the script, I have another query that I don't want to join with the previous one; I want to load another SQL from Table2. In the second SQL I need to load only the Item#s which I already loaded in step 1.
I can use Exists(), but for that I first have to load all of the items in SQL, and only then, in the load script, can I use Exists() to keep only the existing Item#s.
Another way is to concatenate the list of Item#s from step 1, store it in a variable, and then use this variable in the WHERE condition of the next SQL query. (Here we are not loading all of the item numbers in SQL, so I think it would be faster.)
In the second scenario, I don't know whether there is any limitation on variable length or not (the list of Item#s might be half a million).
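The second scenario could be sketched in QlikView script like this (table and field names follow the question; Concat() and Peek() are standard script functions):

```
// Step 1: load the key list from the first source
Items:
LOAD Distinct "Item#";
SQL SELECT "Item#" FROM Table1;

// Build a comma-separated list and park it in a variable
TmpConcat:
LOAD Concat(Distinct "Item#", ',') AS IdList
RESIDENT Items;
LET vItemList = Peek('IdList', 0, 'TmpConcat');
DROP TABLE TmpConcat;

// Step 2: push the list into the second query's WHERE clause
Table2Data:
SQL SELECT * FROM Table2 WHERE "Item#" IN ($(vItemList));
```

Note that with half a million keys the expanded IN() list would be enormous, so database-side limits would apply long before any QlikView variable limit.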
Thanks a lot
I think your case is quite different from the original question, because the limitation will be within the capabilities / settings of your database / driver. Most of them have a limit of around 1,000 items within an IN(), and I doubt that they would support a significantly larger concatenated string of OR conditions.
The usual suggestion for such cases is to use the filter result within an inner-join approach. If both datasets come from a single database it's quite easy; if not, you could output the filter result to a CSV file and load it (temporarily) into the other database.
Another approach might be to stick with your first method and filter the data with an Exists() clause, implemented within an incremental load logic. This may not be the most performant solution, but if it fits within your timeframe for this task it will be sufficient.
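The Exists() variant could be sketched like this (note the SQL still returns all Table2 rows to the driver; only the matching ones are kept in QlikView):

```
// Load the keys first
Items:
LOAD Distinct "Item#";
SQL SELECT "Item#" FROM Table1;

// Preceding LOAD keeps only rows whose Item# was already loaded
Table2Data:
LOAD * WHERE Exists("Item#");
SQL SELECT * FROM Table2;
```

For an incremental version, you would additionally restrict the second SELECT (for example by a modification date) so that each run only transfers new rows.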