selkhoury
Partner - Contributor

Issue loading a large text file to QVD

Hello,

I have a large text file that I am trying to load into a QVD. Each row in the text file can have up to 1025 characters, and the file has over 1 million records. Loading the file works most of the time. But if I load the same file repeatedly (deleting the generated QVD each time), eventually it mangles the layout of some of the records, either splitting them or displaying strange accented characters, and the QVD file ends up corrupted. No errors are shown during the load.

I am using the following code:

MyTable:
LOAD
  @1:n AS FullRecord
FROM [MyFile]
(ansi, fix, codepage is 1252);

STORE MyTable INTO [MyTable.qvd] (qvd);

Has anyone encountered anything like this? Could it be memory related? Please note that I am reloading the same file each time and deleting the previously generated QVD file. I have also run this on another machine with QV Developer, and there it works every time. Any help would be much appreciated.

Thanks,

Sam

8 Replies
whiteline
Master II

Hi.

Interesting. Have you checked your machine's RAM with some stability tests?

You can also test stability in a similar way with an archive utility: repeatedly compress the file and verify that the checksums match across runs.

selkhoury
Partner - Contributor
Author

Hi,

I haven't checked the RAM yet. I tried adjusting the caching options in QlikView, but nothing helped. My other thought was that it might be because we have QV Developer and QV Server on the same VM. Maybe a memory conflict? I will run some tests on the RAM and let you know. Thanks for the idea.

Sam

selkhoury
Partner - Contributor
Author

I went ahead and requested the memory check, and there were no issues. I am not sure what else to check. I thought maybe there were some foreign characters in the text, but that doesn't explain why it works on the other server. Anyone have any other ideas?

Not applicable

A few suggestions:

  • Try LOAD DISTINCT; this will greatly reduce the stored data size by removing duplicates, if any.
  • From what you explained, it seems to work fine on one machine but not the other. I can't imagine this happening because, if you ran out of resources (RAM/CPU), you would get an 'Out of Virtual Memory' error. Since it reloads without errors, I don't think it can corrupt the QVD. How did you confirm the corruption? (I have my doubts.)
  • Maybe the data itself is like that.
  • Compare the QVDs generated on the two different machines and see if they match.
  • You could also run the load on a third machine and compare all three QVDs.
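To compare the two QVDs in QlikView script itself, a minimal sketch along these lines could work — the file names here are hypothetical, and it assumes the QVDs each contain the single FullRecord field:

// Load the server-generated QVD, renaming the field for Exists().
ServerSide:
LOAD FullRecord AS RecServer
FROM [MyTable_server.qvd] (qvd);

// Keep only records from the developer QVD that are missing on the server side.
Mismatches:
LOAD FullRecord AS RecDev
FROM [MyTable_dev.qvd] (qvd)
WHERE NOT Exists(RecServer, FullRecord);

If the Mismatches table is empty (repeat the check in the other direction too), the two QVDs contain the same set of records.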
selkhoury
Partner - Contributor
Author

Hi Ajay,

Actually, it is my machine that loads fine, and it only has 4 GB. It is our server that is having this issue, and it has 16 GB of RAM. I have never seen anything like this before. I know QlikView loads data as UTF-8 internally, but could this happen if the data contains international characters?

Sam

Not applicable

Same problem here in two different machines.

Have you found a solution?

Claudio

vlad_komarov
Partner - Specialist III

Did you create your loading script using the QV wizard? Is the code above your actual script?

I noticed a similar problem a while ago (missing records while loading data from a large TXT file).

In my case the source text file contained multiple columns, and the number of records was quite large.

The loading script was like the one below:

LOAD
.....
FROM [..\My Documents\Development\Book3.txt]
(txt, unicode, embedded labels, delimiter is '\t', msq);

And I was able to resolve the issue by removing the ", msq" portion of the format-specific string.
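With that change, the format spec from my script above would read (a sketch — the column list is elided, as in my original):

LOAD
.....
FROM [..\My Documents\Development\Book3.txt]
(txt, unicode, embedded labels, delimiter is '\t');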

Just a thought.....

Not applicable

For me the problem was resolved by the new release. After installing R7, everything works fine.