mariozubieta
Contributor III

Update Huge DataSet

Hello,

I currently have a process that updates some records in a pretty big dataset.

The set is around 20 million records.

The current process loads the updated records, and then the rest of the records in the dataset, using a WHERE NOT Exists().
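
Roughly, the current script looks like this (the key and file names are just placeholders):

// 1. Load the new/updated records
Updates:
LOAD * FROM Updates.qvd (qvd);

// 2. Append every record from the big QVD that was not updated
Concatenate (Updates)
LOAD * FROM BigDataSet.qvd (qvd)
WHERE NOT Exists(RecordID);

// 3. Write the merged result back over the big QVD
STORE Updates INTO BigDataSet.qvd (qvd);
DROP TABLE Updates;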

I was wondering if any of you guys know a better and faster way of achieving this.

Thanks for the support,

KR,

Mario

3 Replies
Anonymous
Not applicable

Hi Mario,

Is it possible to export the file to txt?

QlikView has excellent compression using txt files; I load close to 18 MM records that way, it takes less than 2 minutes and the file is about 60 MB.
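
If it helps, a minimal sketch of the export and reload (table and file names are illustrative; the format spec may need adjusting to your delimiter and codepage):

// Write the table out as a text file...
STORE BigDataSet INTO BigDataSet.txt (txt);

// ...and read it back in a later reload
BigDataSet:
LOAD * FROM BigDataSet.txt (txt);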

Kind Regards,

rwunderlich
Partner Ambassador/MVP

That's generally the only way to update a QVD. If it takes too long, one approach is to segment the QVD into multiple files, if possible, so that you are only updating a smaller file.
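
For example, if the data can be split on something like a year field, only the segment that actually contains changes needs to be rebuilt (a rough sketch, names are illustrative):

// QVDs are split per year; rebuild only the affected segment
LET vSegment = 2018;    // the segment the updated records belong to

Segment:
LOAD * FROM [Updates_$(vSegment).qvd] (qvd);

Concatenate (Segment)
LOAD * FROM [BigDataSet_$(vSegment).qvd] (qvd)
WHERE NOT Exists(RecordID);

STORE Segment INTO [BigDataSet_$(vSegment).qvd] (qvd);
DROP TABLE Segment;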

-Rob

marcus_sommer

Make sure that your QVD loads are optimized - this means the WHERE Exists(FIELD) has only one parameter and no further transformations are applied within the load. Even with 20 M records it should be quite fast, a matter of seconds in a modern environment.
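
For example (field and file names are just placeholders):

Updates:
LOAD RecordID, Value1, Value2
FROM Updates.qvd (qvd);          // optimized: plain field list, no where clause

Concatenate (Updates)
LOAD RecordID, Value1, Value2
FROM BigDataSet.qvd (qvd)
WHERE NOT Exists(RecordID);      // keep Exists() to a single parameter; renamed or calculated
                                 // fields, or any extra condition, force a slow row-by-row load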

- Marcus