I'm working on a project that requires a large database table (300,000,000+ records) to be converted into a QVD file. If I attempt to load the entire table in one pass, my machine runs out of memory and the script can't complete. My plan, therefore, is to load X records from the table into the QVD at a time (where X is a number of records my computer can handle).
This is done using the following loop:
DO WHILE done = 1
    LET currentBreak = currentBreak + 1;

    // Load the next X records from the database
    TABLE2:
    LOAD * WHERE RecNo() >= $(currentPointer($(currentBreak), $(rowsToLoad)))
             AND RecNo() <  $(nextPointer($(currentBreak), $(rowsToLoad)));  // keep only records between points X and Y
    SQL SELECT * FROM database ORDER BY id;

    // Load the existing QVD; identical field names make it auto-concatenate onto TABLE2
    LOAD * FROM myQVD.qvd (qvd);

    STORE TABLE2 INTO myQVD.qvd (qvd);

    // Peek at the last id fetched in this pass (empty if the pass came up short)
    LET record = Peek('id', $(nextPointer($(currentBreak), $(rowsToLoad))) - $(currentBreak) - 1, 'TABLE2');

    DROP TABLE TABLE2;  // Done to clear RAM

    IF Len('$(record)') > 0 THEN
        LET done = 1;
    ELSE
        LET done = 0;
    END IF
LOOP
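For reference, the currentPointer/nextPointer helpers aren't shown in the post; one plausible definition, sketched in Python, is plain chunk arithmetic over 1-based record numbers (chunk k of size n covers records (k-1)*n+1 up to but not including k*n+1):

```python
def current_pointer(current_break: int, rows_to_load: int) -> int:
    """First record number (1-based) of chunk `current_break`."""
    return (current_break - 1) * rows_to_load + 1

def next_pointer(current_break: int, rows_to_load: int) -> int:
    """First record number of the chunk after `current_break`
    (i.e. one past the last record of the current chunk)."""
    return current_break * rows_to_load + 1

# Chunk 3 at 100,000 rows per pass spans records 200001..300000.
print(current_pointer(3, 100_000))  # → 200001
print(next_pointer(3, 100_000))     # → 300001
```

These names and formulas are assumptions for illustration; the poster's actual helpers may differ.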
This works fine on small tables and behaves as expected. The problem is that when I apply it to the large table I get the same 'out of memory' crash as with a single-pass load, which leads me to believe that QlikView fetches the entire result of the SELECT statement and only THEN filters out the records that don't fall between the two cutoff points.
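That diagnosis matches how preceding loads work: the WHERE RecNo() clause is applied on the client after the SQL SELECT has already returned all rows. If so, one workaround is to push the windowing into the SQL itself, e.g. keyset pagination, so each pass fetches only one chunk from the server. A minimal Python/sqlite sketch of the idea (table name `big_table` and column names are hypothetical, chosen only for illustration):

```python
import sqlite3

# Small in-memory table standing in for the large database table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big_table (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO big_table VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(1, 11)])

def load_in_chunks(conn, chunk_size):
    """Yield rows chunk by chunk via keyset pagination: the WHERE clause
    runs on the database side, so only chunk_size rows are fetched per pass."""
    last_id = 0
    while True:
        rows = conn.execute(
            "SELECT id, val FROM big_table WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, chunk_size),
        ).fetchall()
        if not rows:
            break
        yield rows
        last_id = rows[-1][0]  # resume after the highest id seen so far

chunks = list(load_in_chunks(conn, 4))
# 10 rows in chunks of 4 → chunk sizes 4, 4, 2
```

In the QlikView script the equivalent change would be moving the cutoff into the SQL SELECT (e.g. `WHERE id > '$(lastId)'` with a row limit in the database's dialect) instead of filtering with RecNo() in the preceding LOAD.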