Hi guys,
I'm trying to do an initial load of data into a QVD. The load includes a loop and goes through many inner joins of tables populated with billions of rows in order to get info from a certain field.
Because of the loop, it had stored only about 17,000 rows after 6 whole hours and then failed for no apparent reason.
Any idea why, and how I can fix this?
The timeout could potentially be related to your database.
How long does the command run in your database without Qlik being involved?
How many resulting rows do you expect? Billions? Or does your query return only a subset of the billions and just need to join against them?
Hi @Dalton_Ruer. The query in MySQL needed 5.5 hours and returns all the needed info.
I expect a bit more than a billion resulting rows. My system RAM is 8GB.
I still have this problem; do you know what I could do?
I figured a ton of joins for that many rows would take forever.
My suggestion would be to bring back each table independently of the others, store it into a QVD, and then drop the table from memory before getting the next one. Then figure out how to best join/associate the data inside of Qlik from the QVDs.
The following code block is one flavor of how you could do that. Simply replace Table1, Table2, and Table3 with your real table names.
Tables:
Load * Inline [
    table_name
    Table1
    Table2
    Table3
];

LET NumRows = NoOfRows('Tables');

FOR i = 1 to $(NumRows)
    LET vTable = FieldValue('table_name', $(i));

    // ✋ - The Trace step is only added here in case someone wants to call out progress information that can be read via a log reader
    Trace Loading data for table $(i) of $(NumRows) and the name is $(vTable);

    [$(vTable)]:
    SQL Select *
    From o2c_v2_dm.$(vTable);

    // Variables must be dollar-expanded inside script expressions
    LET vStoreat = '[lib://{your qvd location}/$(vTable).qvd] (qvd)';
    Store [$(vTable)] into $(vStoreat);
    Drop Table [$(vTable)];
NEXT;
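Once the QVDs exist, the join/associate phase runs entirely inside Qlik's engine with MySQL out of the picture. A minimal sketch of what that second script might look like follows; OrderID and SomeField are hypothetical placeholders for your real key field and the one field you actually need:

// Load the driving table from its QVD
Facts:
LOAD *
FROM [lib://{your qvd location}/Table1.qvd] (qvd);

// Join in only the key plus the single field you need,
// which keeps the in-memory footprint as small as possible
LEFT JOIN (Facts)
LOAD OrderID,    // hypothetical key field - use your real one
     SomeField   // hypothetical field you want
FROM [lib://{your qvd location}/Table2.qvd] (qvd);

And since Qlik associates tables automatically on shared field names, you may not need to join at all; simply loading the tables with a common key field is often enough.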
Will try, @Dalton_Ruer. The task log writes:
2021-05-11 23:08:22 Error: Connector reply error: SQL##f - SqlState: S1000, ErrorCode: 2008, ErrorMsg: [MySQL][ODBC 5.3(w) Driver][mysqld-5.5.58-0+deb8u1-log]MySQL client ran out of memory
2021-05-11 23:08:22 Execution Failed
2021-05-11 23:08:22 Execution finished.
I can imagine that with a lot of fields, especially text fields containing large values, a billion-plus rows might exceed memory when you only have 8GB.
Writing each table to a QVD will let you deal with smaller sets of data so you can complete the SQL query, store the data, and move on. Then you can approach the problem as a classic On-Demand Application Generation (ODAG) problem. You want to provide a primary application that allows you to choose a COHORT of values (data for a certain product, year, company, etc.) and a template application that will then load the data from your QVDs as needed rather than the billion-plus rows.
This post I wrote a few months ago will help guide you through the concept more concretely: https://dataonthe.rocks/dynamic-views-in-qlik-sense-saas/
Ignore the fact that I talk about SaaS and that the use case in the example needed ODAG's newest flavor, Dynamic Views. Your use case is exactly why Qlik invented ODAG: more data than can be displayed in a single application.
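To make the template-app side concrete, here is a rough sketch of the load it would run. vSelectedYear and the Year field are hypothetical stand-ins for whatever cohort the selection app passes in (ODAG has its own binding syntax for that, covered in the docs); the point is simply that the template app pulls only the selected slice from the QVDs rather than the billion-plus rows:

// vSelectedYear is a hypothetical stand-in for an ODAG binding
SET vSelectedYear = 2020;

Orders:
LOAD *
FROM [lib://{your qvd location}/Table1.qvd] (qvd)
WHERE Year = $(vSelectedYear);   // hypothetical cohort field

One caveat: a WHERE clause forces an unoptimized QVD load, so if speed matters you can keep the load optimized by filtering with a single-argument Where Exists() against a small key table instead.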