Hello community,
this is the first "big" load I am doing on the new machine, and it leads to an "out of memory" message in QlikView (while QV is using only 0.9 GB on a 16 GB machine).
I hope there is a solution, because QlikView was advertised as handling much bigger data (>64 GB).
The machine is an AMD Phenom II X6 1090T (6x 3.2 GHz) on a standard ASUS mainboard with 16 GB Corsair RAM (which should be sufficient for QV).
Running Windows 7 Professional (64-bit), QV 9.0 SR5.
While using only 16% CPU (across all cores) and only 3 of the 16 GB RAM in total, it shows "OUT OF MEMORY" (ignoring at least 12 GB of free RAM).
The script loads a .qvd file (ca. 100 MB, ca. 700,000 rows of data) and transforms the content into a different format.
The basic idea is to create a column per unit and put the values there:
Start:

| ID  | Unit   | Value |
|-----|--------|-------|
| 101 | TempA1 | 32    |
| 101 | TempB1 | 33    |

Result:

| ID  | TempA1 | TempB1 |
|-----|--------|--------|
| 101 | 32     | 33     |
| 102 | 35     | 34     |
It is only one table, and no difficult calculations are made; values are only shifted from A to B.
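For clarity, here is a minimal sketch of that long-to-wide step using a Generic load. The table and field names (Long, Pivot, ID, Unit, Value) are just the simplified ones from the example above, not the names from my real script:

// Sketch only: pivot the "Start" layout into one column per Unit value.
Long:
LOAD * INLINE [
ID, Unit, Value
101, TempA1, 32
101, TempB1, 33
102, TempA1, 35
102, TempB1, 34
];

// Generic load creates one table per Unit value (Pivot.TempA1, Pivot.TempB1, ...),
// each keyed by ID, and QlikView associates them on ID automatically.
Pivot:
GENERIC LOAD ID, Unit, Value
RESIDENT Long;

DROP TABLE Long;

To end up with one physical table like the Result example, those generated tables would still need to be joined back together on ID.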
Is there a problem with 9.0 running on Win7 64-bit? I see this is a common problem on QV 9.x installations; is it resolved in v10 or v11?
The previous code I posted contained too much unnecessary information.
The shortened code is here:
//#// Content (UNITs) of exported file
B2conv:
LOAD
NAME,
ID_NR,
TYP,
// Calendar
ZEIT,
Jahr,
Monat,
Tag,
Stunde,
TempA1,
TempB1,
TempA2,
TempB2,
TempA3,
TempB3,
(TempA1 - Temp_old) as Delta,
(TempA1/TempB1) as Delta_1,
(TempA2/TempB2) as Delta_2,
(TempA3/TempB1) as Delta_3,
(B1/B2-1) as Abw_1,
(D1/D2-2) as Abw_2,
(C1/C2-3) as Abw_3
// some more values, but no calculations
RESIDENT cross_B2;
drop table cross_B2;
STORE B2conv INTO $(vQVDStore)conv_B2.QVD;
Any ideas are welcome.