I have found a limitation in Qlik Sense (I didn't check in QlikView) with a string built from field names.
Please see "QV script - Fails" attached. It fails when it tries to load "Field149".
If I remove "Field149" from script rows 3 and 9, it works properly (see the file "QV script - OK").
I need to generate a string for a later INSERT into SQL. So far I haven't found another way to create the same result.
Any help will be much appreciated!
I don't think it is Field149 that is creating the issue. If you remove a few characters inside the Concat() function, the script will work. This might be due to a limit on the number of characters in a script function. I would love to know more about this.
I hope @rwunderlich will be able to guide us on this.
This problem reminds me of some of the limitations we have in ODAG apps when we try to pass too many values to a variable using LET. I don't know this limit, but maybe this ODAG page can help you to find a solution:
By the way, at the end what are you trying to achieve?
Currently I don't have Sense available and therefore couldn't check for any possible limitations. In QlikView it worked without any problem.
But I think you could change the variable content to a real string instead of an expression which is executed through the $-sign expansion. This means the variable content might look like:
This will simplify the approach, reduce the length of the string, and might therefore raise the threshold for the number of characters. You could create this structure with a concat aggregation, or also with an inter-record function within a preceding load, like:
Temp_1:
Load Peek('Values') & Field as Values;
Load 'Field' & RecNo() & If(RecNo() < 149, Chr(44)) as Field AutoGenerate 149;

LET vValues = Peek('Values', -1, 'Temp_1');

Str:
NoConcatenate Load $(vValues) Resident Data;
DROP Table Data;
Most likely it is a chr() limitation. I successfully loaded 10,000 simply concatenated '1' values, e.g. '1,'&'1,'...
But a string built with chr(39) fails at a length of approximately 5,000.
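For what it's worth, the test above can be sketched in script like this (a sketch only; the table and variable names are mine, and the exact threshold likely varies by release):

```qlik
// Build a roughly 15,000-character list of quoted values: '1','1','1',...
T:
LOAD 1 as Dummy, Chr(39) & '1' & Chr(39) as Q AutoGenerate 5000;
S:
LOAD Concat(Q, ',') as List Resident T Group By Dummy;
DROP Table T;
LET vList = Peek('List');
// The failure reportedly appears when a quoted string of roughly this
// size is used via $(vList) expansion, not while building it.
```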
This might be the cause, and I'm not surprised, because there are various similar limitations on the number of (nested) functions. For example, QlikView release 11 has a restriction of 99 nested if() calls within a single expression, and the current releases of QlikView and Sense also have such limitations (the exact numbers may have changed) enabled intentionally.
Thanks for the provided URL, but it generates a different result.
It combines all values from one field with a defined chr(). In other words, it works with columns.
For my use case, the values from all fields of each row should be combined into one string, delimited by quotes and commas.
So it should work with rows.
LOAD * INLINE
[ Field1, Field2, Field3
BOS1, BOS2, BOS3
JFK1, JFK2, JFK3
ORD1, ORD2, ORD3 ];
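One way to build that per-row string in script could look like the following (a sketch; the table names Data, Rows, Combined and the variable vValues are my assumptions, not from the thread):

```qlik
Data:
LOAD * INLINE
[ Field1, Field2, Field3
BOS1, BOS2, BOS3
JFK1, JFK2, JFK3
ORD1, ORD2, ORD3 ];

// Build one quoted tuple per row, e.g. ('BOS1','BOS2','BOS3')
Rows:
LOAD 1 as Dummy,
     RecNo() as R,
     '(' & Chr(39) & Field1 & Chr(39) & ','
         & Chr(39) & Field2 & Chr(39) & ','
         & Chr(39) & Field3 & Chr(39) & ')' as RowStr
Resident Data;

// Join all tuples in row order for the VALUES () clause
Combined:
LOAD Concat(RowStr, ',', R) as AllRows
Resident Rows Group By Dummy;

LET vValues = Peek('AllRows');
DROP Tables Rows, Combined;
```

The third argument of Concat() is used as a sort weight so the tuples keep their original row order.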
As I mentioned in my first post, it should be passed into a SQL INSERT statement as VALUES().
INSERT INTO [dbo].[Table1]
([Field1], [Field2], [Field3])
VALUES
('BOS1','BOS2','BOS3'), ('JFK1','JFK2','JFK3'), ('ORD1','ORD2','ORD3')
I have already found a solution where I split the string into two parts. Please see the attachments ("QS script updated" and "QS script result").
Thanks for the suggestions. In my previous post I found a solution, but I'm not sure it is the best one.
At least it works fine for this particular case.
I suppose this is a limitation in the latest releases, but I would like to know exactly when it started to appear.
Could you please advise who the right person to ask would be?
AFAIK there is no public documentation about such limitations. Maybe Henric Cronström could shed some light on it.
Running into such limitations is a strong indicator that the logic used has weaknesses, and rethinking the whole task is often better than looking for ways to bypass the limit. This means knowing the exact threshold is rather not very useful.
As suggested in my other answers, creating real strings is probably better than your alternative of putting an expression into the variable content. There might also be other (loop-based) approaches that make it easier to create such strings.
Besides this, I could imagine that creating a CSV output of the data, which is then imported into your database, might be much more convenient than creating LOAD/SQL statements on the fly.
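A minimal sketch of that CSV route (the lib:// connection name and the database-side bulk load are assumptions, not from the thread):

```qlik
// Write the table to a CSV file instead of generating INSERT statements.
STORE Data INTO [lib://Export/Data.csv] (txt);
// Then bulk-load the file on the database side, e.g. with BULK INSERT
// (SQL Server) or LOAD DATA INFILE (MySQL).
```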