Hello everyone
I'm pretty new to QlikView and am trying to get my head around the different data structures and how to combine and manipulate my data.
In the load script I have loaded a table with a number of rows, and through some loops I have created an array* with the same number of items. I would now like to merge the table and the array so that the first item in the array is appended to the first row of the table, the second item to the second row, and so on.
I don't know if it is possible to incrementally add values to a new column of the loaded table as the loop runs, but suggestions in that direction are welcome too.
Regards
Rasmus
*The array is created like this in each iteration of the loop:
let array[$(k)] = variable_name
where k is the counter in the for loop.
Hi,
instead of creating multiple variables you could create a temporary table storing your variable_name values in the order matching your table rows. Afterwards you would join this temporary table to your initial one based on a common ID or row number field. Finally you would drop the temporary table.
One example to do so could be:
table:
LOAD RowNo() as RowNo,
'SomeValue'&RowNo() as SomeValue
AutoGenerate 10;
FOR k = 1 to 10
LET variable_name = 'VariableValue'&k;
tabArray:
LOAD RowNo() as RowNo,
'$(variable_name)' as ArrayField
AutoGenerate 1;
NEXT k
Join (table)
LOAD * Resident tabArray;
DROP Table tabArray;
But depending on how your variable_name receives its values, the loop might not be necessary either.
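For example, if the values can be derived from the row number alone, the whole FOR loop above could be replaced by a single load (just a sketch; 'VariableValue' stands in for however your values are actually computed):

```
// Build all rows in one pass instead of one 1-row LOAD per iteration
tabArray:
LOAD RowNo() as RowNo,
     'VariableValue' & RowNo() as ArrayField
AutoGenerate 10;
```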
hope this helps
regards
Marco
I'm not sure that you need loops and arrays of this kind. Please post what your source table looks like and how it should look after the transformation.
- Marcus
It may be simpler to post an example of your source data and details of what you are trying to achieve.
As marcus_sommer has said, it is unlikely that you need to process the data through a manual array structure.
I would suggest you look at some posts regarding the data model for QlikView to see the best approach for your data.
Thanks for the replies!
I have yet to try out your suggestions, but I'm posting my script so far for reference.
Basically, I have a table that contains a number of cases, and I have a table with search terms, that I use to classify my cases.
For each case, the loops should test whether the criteria in row 1 of the classification table are met; if so, assign that main category to the case and move on to the next case. If the criteria are not met, it should test the criteria in row 2, and so forth.
Since the classification scheme is complex (and needs to be migratable), I don't want to hard-code it, although that is technically possible.
Hope this helps you understand. As you say, the array may not be needed at all, if I can assign to input_data during the loop.
Regards
Rasmus
Search_terms: //classification table
LOAD MAIN_CATEGORY,
FIELD1,
SEARCH_TERM1,
FIELD2,
SEARCH_TERM2,
FIELD3,
SEARCH_TERM3,
FIELD4,
EXCLUDE_TERM4
FROM
[...]
(ooxml, embedded labels);
Input_data:
LOAD Code,
ID,
Main_text,
Classification
FROM
[...]
(ooxml, embedded labels);
LET NumSearches = NoOfRows('Search_terms');
Let NumCases = NoOfRows('Input_data');
LET cat_array = "";
For k = 0 to $(NumCases) - 1
For i = 0 to $(NumSearches) - 1
Let s1 = Peek('SEARCH_TERM1', $(i), 'Search_terms');
Let f1 = Peek('FIELD1', $(i), 'Search_terms');
Let s2 = Peek('SEARCH_TERM2', $(i), 'Search_terms');
Let f2 = Peek('FIELD2', $(i), 'Search_terms');
Let s3 = Peek('SEARCH_TERM3', $(i), 'Search_terms');
Let f3 = Peek('FIELD3', $(i), 'Search_terms');
Let e = Peek('EXCLUDE_TERM4', $(i), 'Search_terms');
Let f4 = Peek('FIELD4', $(i), 'Search_terms');
Let mcat = Peek('MAIN_CATEGORY', $(i), 'Search_terms');
Let main = lower(Peek('Main_text', $(k), 'Input_data'));
IF IsNull(s2) THEN
IF SubStringCount(main, s1) >= 1 THEN
Let cat_array[$(k)] = mcat;
EXIT FOR
ENDIF;
ELSEIF IsNull(s3) THEN
IF SubStringCount(main, s1) >= 1 and SubStringCount(main, s2) >= 1 THEN
Let cat_array[$(k)] = mcat;
EXIT FOR;
ENDIF;
ELSE
IF SubStringCount(main, s1) >= 1 and SubStringCount(main, s2) >=1 and SubStringCount(main, s3) >= 1 THEN
Let cat_array[$(k)] = mcat;
EXIT FOR;
ENDIF;
ENDIF;
Next;
Next
I would at first go with a mapping and MapSubString(), maybe with something like this:
map1:
mapping LOAD SEARCH_TERM1, '>' & MAIN_CATEGORY & '<' FROM [...] (ooxml, embedded labels);
map2:
mapping LOAD SEARCH_TERM2, '>' & MAIN_CATEGORY & '<' FROM [...] (ooxml, embedded labels);
map3:
mapping LOAD SEARCH_TERM3, '>' & MAIN_CATEGORY & '<' FROM [...] (ooxml, embedded labels);
Input_data:
LOAD
Code, ID, Main_text, Classification,
textbetween(mapsubstring('map1', Main_text), '>', '<') as MAIN_CATEGORY1,
textbetween(mapsubstring('map2', Main_text), '>', '<') as MAIN_CATEGORY2,
textbetween(mapsubstring('map3', Main_text), '>', '<') as MAIN_CATEGORY3
FROM [...] (ooxml, embedded labels);
You may need some adjustments and further checking or filtering on this load result, but I could imagine that this approach would work.
An alternative to this approach could be to join both tables to get a cartesian product of them and then use if(wildmatch(), 1, null()), maybe within a rangecount(), to check whether there are any matches (and which one, if needed), and use this result to filter the table again.
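The cartesian-product idea could be sketched like this (using the field names from the thread; it assumes all three search terms are filled in every row — empty terms, and the EXCLUDE_TERM4 logic, would need extra handling, e.g. a Len() check):

```
// Join with no common fields = cartesian product:
// every case is paired with every search-term row
tmp:
LOAD Code, ID, Main_text FROM [...] (ooxml, embedded labels);
Join (tmp)
LOAD MAIN_CATEGORY, SEARCH_TERM1, SEARCH_TERM2, SEARCH_TERM3
FROM [...] (ooxml, embedded labels);

// Keep only combinations where all terms occur in the text
Result:
LOAD Code, ID, Main_text, MAIN_CATEGORY
Resident tmp
Where WildMatch(Main_text, '*' & SEARCH_TERM1 & '*')
  and WildMatch(Main_text, '*' & SEARCH_TERM2 & '*')
  and WildMatch(Main_text, '*' & SEARCH_TERM3 & '*');

DROP Table tmp;
```

Note that the intermediate table grows with cases × search terms, so this trades memory for avoiding the row-by-row loop.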
- Marcus
This worked (slightly modified) - thanks!
Now I just need to see if I can improve performance.