Qlik Community

QlikView App Dev

Discussion Board for collaboration related to QlikView App Development.

imark
Contributor III

Unnesting array-like fields in QV

Assume we are importing into QV an Excel table that looks as follows:

ID     providers                             metric
101    google.com, yahoo.co.uk, seznam.ru    10
102    comcast.com, aol.net                  21
103    mail.de                               14

And assume our goal is to "unnest" the providers field, ending up, in this example, with 6 rows in the final table:

ID     providers       metric
101    google.com      10
101    yahoo.co.uk     10
101    seznam.ru       10
102    comcast.com     21
...

How can we achieve this in QV? I can think of a solution using two nested loops, but I was wondering whether there is a more code-efficient way.
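To make the target result concrete, here is a rough Python sketch of the transformation I am after (illustrative only, not QlikView script):

```python
# Source rows: each row carries a comma-separated list of providers.
rows = [
    (101, "google.com, yahoo.co.uk, seznam.ru", 10),
    (102, "comcast.com, aol.net", 21),
    (103, "mail.de", 14),
]

# "Unnest": one output row per provider, repeating ID and metric.
unnested = [
    (id_, provider.strip(), metric)
    for id_, providers, metric in rows
    for provider in providers.split(",")
]

for row in unnested:
    print(row)
```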

2 Solutions

Accepted Solutions
jpenuliar
Partner - Specialist III

Hi Imark,

Try this:

Temp00:
Load * Inline [
ID,PROVIDERS,METRIC
101,'google.com, yahoo.co.uk, seznam.ru',10
102,'comcast.com, aol.net',21
103,'mail.de',14
];


NoConcatenate
Load
    ID,
    // IterNo()-th comma-separated piece; Trim drops the space after each comma
    Trim(SubField(PROVIDERS, ',', IterNo())) as PROVIDERS,
    METRIC
Resident Temp00
// one iteration per piece: pieces = commas + 1
While IterNo() <= SubStringCount(PROVIDERS, ',') + 1;

DROP Table Temp00;


rwunderlich

There is no need for the IterNo(). Just use the two-parameter SubField() and the additional rows will be created automatically:

Load
  ID,
  trim(SubField(PROVIDERS, ',')) as PROVIDERS,
  METRIC
Inline [
ID,PROVIDERS,METRIC
101,'google.com, yahoo.co.uk, seznam.ru',10
102,'comcast.com, aol.net',21
103,'mail.de',14
];

-Rob
http://masterssummit.com
http://qlikviewcookbook.com
http://www.easyqlik.com


4 Replies
Marcos_rv
Creator II

Try this:

tablaaaaaa:
load *,
    SubStringCount(providers, ',') as count;  // number of commas = providers - 1
load * inline [
ID, providers, metric
101, "google.com, yahoo.co.uk, seznam.ru", 10
102, "comcast.com, aol.net", 21
103, "mail.de", 14
];

let vVar = NoOfRows('tablaaaaaa');

for i = 0 to $(vVar) - 1    // outer loop: one pass per source row

    LET vID   = peek('ID', $(i), 'tablaaaaaa');
    let vSUB  = peek('count', $(i), 'tablaaaaaa');
    LET vPROV = peek('providers', $(i), 'tablaaaaaa');

    for j = 0 to $(vSUB)    // inner loop: one pass per provider in the row

        AUX_TABLA:
        LOAD
            $(vID) AS ID,
            SubField('$(vPROV)', ',', $(j)+1) as providers
        AutoGenerate 1;

    next
NEXT;

NoConcatenate
final_tabla:
load
    ID,
    providers
Resident AUX_TABLA;

LEFT JOIN (final_tabla)   // bring the metric back in by ID
LOAD
    ID,
    metric
Resident tablaaaaaa;

DROP TABLE tablaaaaaa;
DROP TABLE AUX_TABLA;

exit script;

 

Sorry for the names I used for the variables and tables; I treat these kinds of queries as a game, and scripting is fun. Regards!


imark
Contributor III
Author

Talk about a neat solution! Thanks a lot to you both.