Not applicable

Data loss when loading from local file

Hi,

I am trying to read in a flat file with 210K rows and 45 columns, but somehow only about 117K rows (and all columns) come in. My file is Unicode format (UTF-8) and the load script is as follows:

Table:
LOAD *
FROM
FILE.TXT(txt, utf8, embedded labels, delimiter is '|', msq, header is 7 lines, filters(
Remove(Row, Pos(Top, 2)),
Remove(Col, Pos(Top, 1))
));

I apply a transformation that removes a couple of extra rows and columns and sets the header to the 7th row. However, I have also tried the load with an edited file where this transformation is not necessary, with the same end result.

Is there possibly a maximum-row limitation in the QlikView load script that I haven't figured out?

1 Solution

Accepted Solutions
stephencredmond
Luminary Alumni

Hi,

It's not a limitation; it's a problem with the encoding of the file and the quoting used. Try changing the quoting from msq to standard.

Regards,


Stephen
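
If I recall the QlikView file-format syntax correctly, selecting Standard quoting in the table wizard simply drops the msq keyword from the format specification (or you can write no quotes to ignore quote characters entirely). Applied to the original script, the suggested change would look something like this (a sketch; everything except the quoting keyword is kept as posted):

Table:
LOAD *
FROM
FILE.TXT(txt, utf8, embedded labels, delimiter is '|', header is 7 lines, filters(
Remove(Row, Pos(Top, 2)),
Remove(Col, Pos(Top, 1))
));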


4 Replies
Not applicable
Author

EDIT: Solved. After trying this over and over again, I changed the Quoting setting from MSQ to Standard. Somehow the MSQ quoting had caused the loss of data...


stephencredmond
Luminary Alumni

Glad you found it.

BTW - this is not a "loss" of data. The load just gets to a certain point where the msq quoting indicates an end of file, so you end up getting only the first XXX lines. It simply never reaches the end of the file.

Regards,

Stephen

kji
Employee

The cause is generally an unbalanced quote character on a line, which makes the rest of the file be treated as one record; MSQ supports multi-line data by this method.
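
This behavior can be illustrated with a quote-aware parser in another language. A minimal Python sketch (the sample data and pipe delimiter are made up; Python's csv module stands in for MSQ-style quoting here):

```python
import csv
import io

# Hypothetical sample: four pipe-delimited rows; row 2 has an unbalanced quote.
data = 'a|b\n"oops|2\n3|4\n5|6\n'

# Quote-aware parsing (analogous to MSQ): the stray quote on row 2 opens a
# quoted field that never closes, so the parser swallows the rest of the
# file into that single record and the later rows disappear.
quoted = list(csv.reader(io.StringIO(data), delimiter='|'))

# Quote-agnostic parsing (analogous to ignoring quotes): every newline ends
# a record, so all four rows survive.
plain = list(csv.reader(io.StringIO(data), delimiter='|',
                        quoting=csv.QUOTE_NONE))

print(len(quoted))  # fewer records: everything after the stray quote merged
print(len(plain))   # all 4 rows
```

The same mechanism explains why the load stopped partway through the file rather than truly "losing" rows.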