felcar2013
Partner - Creator III

Loading a table with 760 million rows

Hi,

I have a table with 780 million rows and 5 columns.

When loaded in isolation it takes only 16 minutes, but the load gets stuck afterwards. In case incremental load is not possible, what are other loading options?

I use the deployment framework, and the problem occurs both when connecting directly to the database (extract) and when loading from QVD. I need all the data as per the requirement, and at the moment there is no way to reduce it.

Thanks,

Felipe

16 Replies
Marcio_Campestrini
Specialist

Felipe

Try to split the TN_DATUM field into two different fields (date and time, if you really need both).
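A minimal sketch of that split in the load script, assuming TN_DATUM is a timestamp field; TN_ID and the other field names are placeholders:

    Facts:
    LOAD
        TN_ID,
        Date(Floor(TN_DATUM)) AS TN_DATE,   // whole days only
        Time(Frac(TN_DATUM))  AS TN_TIME,   // time of day only
        Field4,
        Field5;
    SQL SELECT TN_ID, TN_DATUM, Field4, Field5
    FROM SourceTable;

Splitting this way shrinks the symbol tables considerably: a date field has at most a few thousand distinct values and a time field at most 86,400, while a full timestamp can be nearly unique per row.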

I don't know the period you have to load, but I like the idea of creating multiple QVDs split by year/month and then loading them into the QVW. With this approach you can also pinpoint where the load problems occur.
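One way to script that extract loop, as a sketch; the year range, field names, and QVD names are assumptions, every month is assumed to contain data, and the WHERE clause presumes the source database supports YEAR()/MONTH():

    FOR vYear = 2013 TO 2015
      FOR vMonth = 1 TO 12

        // pull one month from the source
        MonthData:
        SQL SELECT TN_ID, TN_DATUM, Field3, Field4, Field5
        FROM SourceTable
        WHERE YEAR(TN_DATUM) = $(vYear)
          AND MONTH(TN_DATUM) = $(vMonth);

        // store the slice as its own QVD, then free the memory
        STORE MonthData INTO [TN_$(vYear)_$(vMonth).qvd] (qvd);
        DROP TABLE MonthData;

      NEXT vMonth
    NEXT vYear

Reading the slices back with a plain wildcard load (LOAD * FROM [TN_*.qvd] (qvd)) remains an optimized QVD load as long as no transformation is applied.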

Márcio Rodrigo Campestrini
felcar2013
Partner - Creator III
Author

Hi,

There are no other loads, and after loading the table, the application stores it and then drops it. There is no possibility of a synthetic key being built.

I need the primary key to identify which customers participated in a campaign.

Probably the solution in the end is to use incremental load. I just wanted to know exactly why this does not work after the migration.

felcar2013
Partner - Creator III
Author

Hi,

Thanks. Yes, I already load the date with Floor() rather than as a full timestamp, but I will split the big table into separate tables by year/month and store them as different QVDs as you suggest.

marcus_sommer

It will be quite difficult to say what the reason could be without knowing what changed with the migration: QV release, OS version, storage environment, virtualization (yes/no/changed), other things?

- Marcus

Marcio_Campestrini
Specialist

Hi Felipe

Is your problem solved? If yes, please mark the helpful/correct answers to close the thread.

Thanks.

Márcio Rodrigo Campestrini
felcar2013
Partner - Creator III
Author

Hi Marcio,

The solution was to reduce the table to 270 million rows by deleting old history.

For new data we will use incremental load; the source table needed to be updated to allow us to recognize new and modified records.
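For reference, the classic insert-and-update pattern this enables, as a sketch; ModifiedDate, TN_ID, the QVD name, and the vLastExecTime variable (set before this runs) are assumed names:

    // 1. pull only rows inserted or changed since the last run
    Campaigns:
    SQL SELECT TN_ID, TN_DATUM, Field3, Field4, ModifiedDate
    FROM SourceTable
    WHERE ModifiedDate >= '$(vLastExecTime)';

    // 2. append the untouched history from the previous QVD;
    //    Exists() keeps superseded versions of updated rows out
    Concatenate (Campaigns)
    LOAD * FROM [Campaigns.qvd] (qvd)
    WHERE NOT Exists(TN_ID);

    // 3. write the merged table back for the next run
    STORE Campaigns INTO [Campaigns.qvd] (qvd);

The WHERE NOT Exists() makes the QVD read non-optimized, but it is still far cheaper than re-extracting the full history from the database on every reload.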

Regards,

Felipe

Marcio_Campestrini
Specialist

Hi,

Have you solved your problem? If so, please choose the correct answer and help us keep the community focused.

Márcio Rodrigo Campestrini