felcar2013
Partner - Creator III

load table 760 million rows

Hi,

I have a table with 780 million rows and 5 columns, with the following information:

When loaded in isolation it takes only 16 minutes, but it gets stuck afterwards. In case incremental load is not possible, what are the other loading options?

I use the deployment framework, and the problem occurs both when connecting directly to the database (extract) and when loading from the QVD. I need all the data, as per the requirement, and at the moment there is no way to reduce it.
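Roughly, the two stages in our framework look like this (table, file, and field names here are illustrative, not our real ones):

// Extract layer: pull straight from the database and store to QVD
BigTable:
LOAD *;
SQL SELECT * FROM SourceTable;
STORE BigTable INTO BigTable.qvd (qvd);
DROP TABLE BigTable;

// Transform layer: reload the stored QVD
BigTable:
LOAD * FROM BigTable.qvd (qvd);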

Thanks,

Felipe

1 Solution

Accepted Solutions
felcar2013
Partner - Creator III
Author

Hi Marcio,

The solution was to reduce the table to 270 million rows by deleting the old history.

For new data we will use incremental load; the source table needed to be updated to allow us to recognize new and modified records.
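For reference, a minimal sketch of the incremental pattern (names are illustrative; it assumes the new audit column on the source is called ModifiedDate and that vLastExecTime holds the timestamp of the last successful reload):

// Pull only new and modified rows from the source
BigTable:
LOAD *;
SQL SELECT PrimaryKey, Field1, Field2, Field3, DateField
FROM SourceTable
WHERE ModifiedDate >= '$(vLastExecTime)';

// Append the untouched history from the existing QVD;
// Exists() skips keys that were just reloaded, i.e. modified records
Concatenate (BigTable)
LOAD PrimaryKey, Field1, Field2, Field3, DateField
FROM BigTable.qvd (qvd)
WHERE NOT Exists(PrimaryKey);

// Write the merged result back for the next run
STORE BigTable INTO BigTable.qvd (qvd);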

Regards,

Felipe


16 Replies
Peter_Cammaert
Partner - Champion III

When you get stuck loading, does the QlikView application/server become unresponsive, or do you get an error message?

Does your QlikView platform have enough memory to handle this amount of data?

Peter

trdandamudi
Master II

Based on my past experience, I think it is because of the memory. How much memory do you have?

felcar2013
Partner - Creator III
Author

Hi,

There is no error; see below. After loading, "x lines fetched" appears and then it gets stuck, for over 20 hours.

felcar2013
Partner - Creator III
Author

164 GB RAM

When I load it from the QMC it uses 90% of the memory, while doing it manually takes only 9%.

marcus_sommer

The RAM consumption isn't quite clear from this information. I suggest you restart the QlikView server services, start the reload again, and monitor the RAM consumption carefully in the Task Manager. You should also pay attention to other activities on this server, since not only qvb.exe will require RAM but also qvs.exe and perhaps some other services, too.

- Marcus

felcar2013
Partner - Creator III
Author

I restarted the services and saw that as the application loads the table, the memory usage goes up slowly until it reaches over 90%.

The issue is that this problem has only occurred since a migration of the database server the table comes from. Before that, QV loaded this table with no problems.

marcus_sommer

The stalling itself is clear - there is not enough RAM available. But it looks a bit strange to me that a QVD with 780 M records and only 5 columns takes so much RAM - are you sure it is only 5 columns and you didn't perform a wildcard load on more columns? What kind of data do those columns contain?
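One way to rule that out is to list the fields explicitly instead of using LOAD * (a sketch with illustrative field names):

// Naming the five fields explicitly ensures no extra columns sneak in
BigTable:
LOAD PrimaryKey,
     Field1,
     Field2,
     Field3,
     DateField
FROM BigTable.qvd (qvd);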

- Marcus

felcar2013
Partner - Creator III
Author

The table contains a primary key, and the other columns contain no more than 1% distinct values relative to the total number of primary keys.

I loaded this table in isolation, directly from SQL and then in the transform file. In both cases, after loading all the data rows, it gets stuck where QV shows "... lines fetched". In the transform file, the "date" field is transformed with a Floor(). The table loads really fast, in less than 15 minutes (in both the extract and the transform file), but then it gets stuck on "... lines fetched". What is strange to me is that this was not the case before our migration took place. User rights were changed/migrated and the Publisher was updated, and then the problem appeared. Is there anywhere we can check (server side / QV)?
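The transform step looks roughly like this (field names are illustrative). As far as I know, applying Floor() means the QVD is no longer read as an optimized load, so all 780 M rows are processed record by record:

BigTable:
LOAD PrimaryKey,
     Field1,
     Field2,
     Field3,
     Floor(DateField) AS DateNum  // this transformation breaks the optimized QVD load
FROM BigTable.qvd (qvd);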

marcus_sommer

Are there other loads within this application? If yes, synthetic keys and/or circular references could occur between the tables, and QV could need a long time to calculate them (with 780 M records, many hours, and it will very probably need more RAM than is available). You could check this within the debugger, or with a FIRST statement to limit the data - maybe to 100 records - and see what happens in the table viewer.
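For example, something like this limits the reload to the first 100 records (file name illustrative):

// Load only the first 100 records to test the data model quickly
First 100
LOAD *
FROM BigTable.qvd (qvd);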

Another thought is to leave the primary key out of these loads/tables - is there really a reason to keep it if you don't apply incremental loading? See the sketch below.
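Dropping the key is just a matter of leaving the field out of the LOAD list. If it must stay, AutoNumber() replaces the wide distinct values with compact integers, though it also makes the load non-optimized (a sketch, field names illustrative):

BigTable:
LOAD AutoNumber(PrimaryKey) AS KeyId,  // compact sequential integer instead of the original key
     Field1,
     Field2,
     Field3,
     DateField
FROM BigTable.qvd (qvd);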

- Marcus