Hello,
I currently have a process that updates some records in a pretty big dataset.
The set is around 20 million records.
The current process loads the updated records and then the rest of the records in the dataset using a WHERE NOT EXISTS.
I was wondering if any of you know a better and faster way of achieving this.
Thanks for the support,
KR,
Mario
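
For reference, the process Mario describes is the standard incremental-load recipe: pull only the changed rows from the source, then append the untouched history from the QVD. A minimal sketch (table, field, and file names like Orders, OrderID, LastModified and the variable vLastExecTime are hypothetical, and an ODBC/OLE DB connection is assumed to exist):

```
// 1) Load only the new/changed records from the database.
Changes:
LOAD OrderID, Amount, LastModified;
SQL SELECT OrderID, Amount, LastModified
FROM Orders
WHERE LastModified >= '$(vLastExecTime)';

// 2) Append the untouched historical records from the existing QVD.
//    WHERE NOT EXISTS keeps only rows whose key was not loaded in step 1.
Concatenate (Changes)
LOAD OrderID, Amount, LastModified
FROM Orders.qvd (qvd)
WHERE NOT Exists(OrderID);

// 3) Overwrite the QVD with the merged result.
STORE Changes INTO Orders.qvd (qvd);
```

Because the QVD load in step 2 uses a plain field list and a single-field Exists() filter, it can run as an optimized QVD load, which is what keeps this fast even on large files.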
Hi Mario,
Can you export the file to txt?
QlikView has excellent compression using txt files; I load close to 18 MM records, it takes less than 2 minutes, and the file weighs about 60 MB.
Kind Regards,
It's generally the only way to update a QVD. If it takes too long, one approach is to segment the QVD into multiple files if possible so you are only updating a smaller file.
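
A sketch of the segmentation idea: split the one big QVD into per-year files once, so that each nightly update only has to rewrite the segment that actually changes (field and file names like OrderDate and Orders.qvd are hypothetical, as is the year range):

```
// One-time split of a large QVD into per-year segments.
// The Year() expression makes this load non-optimized, which is
// acceptable for a one-off restructuring run.
FOR vYear = 2010 TO 2015

  Orders_$(vYear):
  NOCONCATENATE LOAD *
  FROM Orders.qvd (qvd)
  WHERE Year(OrderDate) = $(vYear);

  STORE Orders_$(vYear) INTO Orders_$(vYear).qvd (qvd);
  DROP TABLE Orders_$(vYear);

NEXT vYear
```

After the split, the incremental update only needs to load, merge, and STORE the current year's QVD, so the 20 M-row history is never rewritten.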
-Rob
Make sure that your QVD loads are optimized - that means using WHERE EXISTS(FIELD) with only one parameter and applying no further transformations within the load. Even with 20 M records it should be quite fast, a matter of seconds in a modern environment.
- Marcus
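
To illustrate the point about optimized loads: a QVD load stays optimized only while the field list is untransformed and the only filter is a single-field Exists()/NOT Exists(). A hedged example (field names are hypothetical):

```
// Optimized: plain field list, single-parameter Exists() filter only.
Orders:
LOAD OrderID, Amount
FROM Orders.qvd (qvd)
WHERE NOT Exists(OrderID);

// NOT optimized: any expression, rename, or additional condition in the
// WHERE clause drops the load back to standard row-by-row speed, e.g.:
// LOAD OrderID, Amount * 1.19 AS GrossAmount
// FROM Orders.qvd (qvd)
// WHERE NOT Exists(OrderID) AND Amount > 0;
```

A quick way to check is the script execution progress window: an optimized load is reported as "qvd optimized" while it runs.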