Dear Qlikview Experts,
I have been stuck here for quite some time. Basically, I used a CrossTable load in my script, and then I realized the number of rows before and after the CrossTable load is different!
So before loading into QlikView, I have 16 rows of data in a txt file.
But after loading into QlikView using CrossTable, I have 92 rows of data in the data model. My loading script is pasted below:
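For context, this row increase is expected: CrossTable unpivots the table, so each input row becomes one output row per value column. A minimal sketch, assuming a tab-delimited file with one qualifier column (the field and file names here are hypothetical, since the actual script is in the attachment):

    // CrossTable(attribute, value, n) keeps the first n fields as
    // qualifiers and turns every remaining column into one
    // attribute/value row, so 16 input rows with 4 value columns
    // become up to 64 output rows (fewer if nulls are suppressed).
    Data:
    CrossTable(Attribute, Value, 1)
    LOAD Date, @12, @13, @14, @15
    FROM [data.txt] (txt, utf8, no labels, delimiter is '\t');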
So my question is:
How can I count the number of rows where all 4 fields (@12, @13, @14 & @15) contain null values, as a fraction of the total number of rows before loading into QlikView? e.g. 2/16
Is there any way that I can achieve this outcome?
Attached sample qvw file and txt file for your reference.
Thank you for your time in advance!
Best Regards
QianNing
Dear Belloum,
I tested both of your expressions, but they do not work.
For the number of null rows I should get 2, but it returns '1'.
For the number of rows where any field contains data I should get 14, but it returns '-'.
Best Regards
Qianning
how did you test the value of that field ?
Dear Belloum,
I just created a text object to test the expressions, but the numbers it returns are wrong.
Best Regards
QianNing
I'm sure there is a misunderstanding here.
this:
if(len(trim(@12))=0 AND len(trim(@13))=0 AND len(trim(@14))=0 AND len(trim(@15))=0, 1) as null_rows
you must put it in the script (without a leading '=').
to test it in a text object, you should use this:
=Sum(null_rows)
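Putting the pieces together, a minimal sketch (table, field, and file names are assumptions, since the actual script is in the attachment): flag each all-null row in the script, keep a row counter, then build the 2/16-style ratio in a text object.

    Data:
    LOAD *,
         // 1 when all four fields are empty or blank, else null
         if(len(trim(@12))=0 and len(trim(@13))=0
            and len(trim(@14))=0 and len(trim(@15))=0, 1) as null_rows,
         // row counter, so the total row count is easy to get later
         RecNo() as row_id
    FROM [data.txt] (txt, utf8, no labels, delimiter is '\t');

Then, in a text object:

    =Sum(null_rows)                          // number of all-null rows
    =Sum(null_rows) & '/' & Count(row_id)    // ratio such as 2/16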
Dear Belloum,
Thank you for your clarification. I understand that putting it into the load script gives the fixed counts from the table. But I also hope it will change according to the filtering in the data model. That is, if I just load @12, @13, @14 and @15 into my data model and then create a listbox for my date field, when I select a different date, the number of null rows should change accordingly.
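Note that a script-created flag is still selection-aware: Sum(null_rows) in a text object is evaluated only over the currently selected rows, so it updates when the date listbox changes. Alternatively, the whole test can live in the front end; a sketch of such a text-object expression (using the field names from the post):

    =Sum(if(len(trim(@12))=0 and len(trim(@13))=0
            and len(trim(@14))=0 and len(trim(@15))=0, 1, 0))

This is evaluated once per record of the logical table and re-computed whenever a selection changes, at the cost of doing the null test at chart time instead of once at load time.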
Best Regards
QianNing
Dear Rodell,
Thank you for your kind suggestion. It might work for this example, but next time, when dealing with a large number of log files, I am worried that there will be multiple unique data rows generated at the same timestamp.
Best Regards,
QianNing