Hello,
I have a large text file that I am trying to load into a QVD. Each row in the file can have up to 1025 characters, and the file has over 1 million records. Loading the file works MOST of the time, but if I load the SAME file over and over again (after deleting the generated QVD file), it eventually mangles the layout of some records, either splitting them or displaying strange accented characters, and basically corrupts the QVD file. It does not show any errors during the load.
I am using the following code:
MyTable:
LOAD
@1:n AS FullRecord
FROM
[MyFile]
(ansi, fix, codepage is 1252);
STORE MyTable INTO [MyTable.qvd](qvd);
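
One way to narrow down where the corruption happens is to reload the freshly written QVD and compare row counts against the in-memory table. This is just a minimal sketch based on the script above; the CheckTable name and the variable names are illustrative:

```
// Compare the row count of the in-memory table with the QVD just written.
// NoConcatenate stops the check load from auto-concatenating onto MyTable
// (both tables would otherwise share an identical field list).
CheckTable:
NoConcatenate
LOAD FullRecord
FROM [MyTable.qvd] (qvd);

LET vSourceRows = NoOfRows('MyTable');
LET vQvdRows    = NoOfRows('CheckTable');
TRACE Source rows: $(vSourceRows) / QVD rows: $(vQvdRows);

DROP TABLE CheckTable;
```

If the two counts ever differ between runs of the same source file, the problem is occurring during the LOAD or STORE rather than in the QVD itself.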
Has anyone encountered anything like this? Could it be memory related? Please note that I am using the same file to reload and I do delete the previously generated qvd file. I have also tried to run this on another machine with QV Developer and it works great all the time. Any help will be much appreciated.
Thanks,
Sam
Hi.
Interesting. Have you checked the RAM of your machine with some stability tests?
You can also stress-test the memory with archive/compression utilities, which exercise RAM in a similar way.
Hi,
I haven't checked the RAM yet. I tried adjusting the caching options in QlikView, but nothing helped. My other thought was that it might be because we have QV Developer and QV Server on the same VM. Maybe a memory conflict? I will run some tests on the RAM and let you know. Thanks for the idea.
Sam
I went ahead and requested the memory check, and there were no issues. I am not sure what else to check. I was thinking maybe there were some foreign characters in the text, but that doesn't explain why it works on the other server. Does anyone have any other ideas?
A few suggestions:
Hi Ajay,
Actually, it is my machine that is loading fine, and I only have 4GB; it is our server that is having this issue, and it has 16GB of RAM. I have never seen anything like this before. I know QV loads data as UTF-8, but could this happen if the data contains international characters?
Sam
Same problem here in two different machines.
Have you found a solution?
Claudio
Did you create your loading script using a QV wizard? Is the code above your actual script?
I noticed a similar problem a while ago (missing records while loading data from a large TXT file).
In my case the source text file contained multiple columns, and the number of records was pretty large.
The loading script was like the one below:
LOAD
.....
FROM
[..\My Documents\Development\Book3.txt]
(txt, unicode, embedded labels, delimiter is '\t', msq);
And I was able to resolve the issue by removing the [, msq] portion of the format specification string.
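
For reference, the adjusted format specification (the same as the snippet above with only msq removed; the elided field list is kept as-is) would read:

```
LOAD
.....
FROM
[..\My Documents\Development\Book3.txt]
(txt, unicode, embedded labels, delimiter is '\t');
```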
Just a thought.....
For me the problem was resolved with the new release. After installing R7, everything works fine.