Hello!
We have a huge CSV file that needs to be loaded and then saved as a QVD.
Within our installation, data tables are always dumped into CSV files and then transformed into QVDs.
The real applications (QVWs) read those QVDs (not the original database or CSV files), which makes development much faster.
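For context, the dump-and-transform step described above can be sketched as a minimal load script (file, table names and format options are placeholders, not taken from the original kit):

```qlikview
// Load the raw CSV dump (file name and format options are assumptions)
MyTable:
LOAD *
FROM [MyTable.csv] (txt, codepage is 1252, embedded labels, delimiter is ',');

// Persist the table as a QVD so the QVW applications can read it quickly
STORE MyTable INTO [MyTable.qvd] (qvd);
DROP TABLE MyTable;
```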
So far, so good.
Now it happens that we have a huge CSV file that cannot be loaded by QV. What we have observed:
So it looks very much like a limitation of QlikView when loading huge files.
Do you think you could download the attached (linked) kit and run it on your side?
In the kit you have:
Hope you can at least reproduce the problem on your side. If it is a technical problem, I will need to open a support request with QlikTech.
But before doing that I want to test with other fellows.
Thanks in advance for your support!
Without SET ErrorMode=0 the script would actually stop at the first error.
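For readers unfamiliar with the variable, a minimal sketch of how SET ErrorMode is typically wrapped around a fragile load (the file name is a placeholder):

```qlikview
SET ErrorMode = 0;    // 0 = skip errors and continue; the default is 1 (halt on error)
Data:
LOAD * FROM [huge.csv] (txt, embedded labels, delimiter is ',');
SET ErrorMode = 1;    // restore normal error handling for the rest of the script
```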
-alex
I have been running it for quite a few minutes and it loads very slowly because of all the transformations (the IF statements).
Try loading it straight into memory and then doing the transforms from the resident table, e.g.:
tFCRES:
LOAD * FROM FCRES.CSV;

FCRES:
LOAD ..., IF(...), ...
RESIDENT tFCRES;

DROP TABLE tFCRES;
Hope this helps!
Gordon
I have a 32-bit laptop with QV10. The original script gives an out-of-memory error, so I did:
All files load just fine. It means you need to move to a machine with more memory, and possibly a 64-bit one.
When you remove columns, memory consumption drops accordingly.
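The column reduction mentioned above can be sketched as an explicit field list instead of LOAD * (field names are invented for illustration):

```qlikview
// Loading only the fields actually needed downstream
// cuts memory use roughly in proportion to the columns dropped.
FCRES:
LOAD Field1,
     Field2,
     Field3
FROM [FCRES.CSV] (txt, embedded labels, delimiter is ',');
```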
-----------
And rewrite those copy-pasted code chunks; they are a mess.
-Alex
This script is not meant to be "human readable".
The script is generated automatically by a COBOL program. The COBOL program, running on a mainframe, dumps a table into the CSV and then creates the script that transforms the CSV into a QVD.
The IF statements are there to detect invalid content in the fields.
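A typical shape for such validation IFs might look like this (field names and rules are invented for illustration, not taken from the generated script):

```qlikview
FCRES:
LOAD
    // Keep the value only if it is numeric, otherwise null it out
    IF(IsNum(Amount), Num(Amount), Null()) AS Amount,
    // Replace empty codes with an explicit marker
    IF(Len(Trim(Code)) > 0, Code, 'UNKNOWN') AS Code
FROM [FCRES.CSV] (txt, embedded labels, delimiter is ',');
```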
But let's listen to other opinions...