Not applicable

Memory tips and tricks

Hi Everyone,

I'm currently using a 32-bit laptop with 4 GB of RAM, and I'm increasingly hitting the ceiling when it comes to loading data. Does anyone have any tips or tricks for optimizing the data load? I have tried multi-staging (up to 4 stages) as well as dropping extraneous fields, but I seem to run into problems more often than not.

1 Solution

Accepted Solutions
Clever_Anjos
Employee
Employee

Well, extra RAM modules are not that expensive.

I've just noticed that you're running 32-bit Windows with 4 GB of RAM.

Switching to 64-bit Windows will improve your performance too (64-bit Windows can address all of your memory).

View solution in original post

8 Replies
Not applicable
Author

Drop unnecessary tables as well.

During development, limit the number of rows loaded.

Debug --> Limit No. of Rows --> Run restricts how many rows are loaded from each table (a script-side equivalent is sketched below).
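
For example (a minimal sketch - the table and field names are made up):

// During development, the First prefix caps how many rows are read,
// which is the script-side equivalent of Debug --> Limit No. of Rows:
Sales:
First 1000
LOAD OrderID,
     CustomerID,
     Amount
FROM sales.qvd (qvd);

// Staging tables and fields that are no longer needed can be dropped
// so they stop consuming RAM:
DROP TABLE TempSales;
DROP FIELD SessionID;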

Not applicable
Author

I am loading 4 tables, storing them as individual QVD files, then reloading them all into a single table, storing that as a QVD file, then reloading it as an optimized QVD load, but I still begin to hit the limit.
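
Roughly, the pattern I'm using looks like this (file and field names are placeholders):

// Stage 1: extract each source table, store it as its own QVD,
// and drop it from RAM:
STORE Orders INTO orders.qvd (qvd);
DROP TABLE Orders;

// Stage 2: reload the QVDs and concatenate into a single table.
// Keeping the field list identical to the QVD (no calculations)
// keeps the load "optimized", so QlikView streams it straight in:
Fact:
LOAD OrderID, CustomerID, OrderDate, Amount
FROM orders.qvd (qvd);

Concatenate (Fact)
LOAD OrderID, CustomerID, OrderDate, Amount
FROM invoices.qvd (qvd);

STORE Fact INTO fact.qvd (qvd);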

kangaroomac
Partner - Creator II
Partner - Creator II

Hiya,

What do you mean by multi-staging? Incremental loads?

Dropping fields is a very good starting point - have you used the Document Analyzer (Google it if you're unsure what it is) to make sure you drop all unneeded fields?

The best (and probably most important) thing is to make sure you have designed the correct data model. A star schema is preferred, as it takes advantage of the algorithms QlikView uses. It can be achieved by concatenating or linking fact tables, creating mapping tables for "loosely" joined dimension tables (see the sketch below), and keeping fields at low granularity (e.g. instead of loading timestamps, split them into a date field and a time field - far fewer distinct values).
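
For instance, a mapping table can fold a loosely joined dimension into the fact table as a single field (all names here are invented for illustration):

CountryMap:
MAPPING LOAD CustomerID,
             Country
FROM customers.qvd (qvd);

Fact:
LOAD OrderID,
     CustomerID,
     ApplyMap('CountryMap', CustomerID, 'Unknown') AS Country,
     Amount
FROM orders.qvd (qvd);

This removes the link table entirely - one less table and one less key field held in RAM.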

Also, see if you can load the data that doesn't change often (usually dimension tables) from QVDs via an extractor application rather than in the "main" document.

Last but not least, if you haven't done it yet, implement incremental loads, so that only new or changed data is reloaded instead of everything (a bare-bones version is sketched below). Otherwise, make sure you only load the data that is needed - i.e. if your sales staff only look at sales for the past 5 years, don't load 10.
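
An incremental load looks something like this - it assumes the source has a ModifiedDate field and that history.qvd holds everything loaded so far (all names are illustrative):

// Find the high-water mark of the data already in the QVD:
LastReload:
LOAD Max(ModifiedDate) AS MaxDate
FROM history.qvd (qvd);
LET vLastReload = Peek('MaxDate', 0, 'LastReload');
DROP TABLE LastReload;

// Pull only new/changed rows from the database:
History:
LOAD OrderID, ModifiedDate, Amount;
SQL SELECT OrderID, ModifiedDate, Amount
FROM Orders
WHERE ModifiedDate > '$(vLastReload)';

// Append the untouched rows from the existing QVD and save it back:
Concatenate (History)
LOAD OrderID, ModifiedDate, Amount
FROM history.qvd (qvd)
WHERE NOT Exists(OrderID);

STORE History INTO history.qvd (qvd);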

Hope this helps.

Clever_Anjos
Employee
Employee

Another tip is splitting datetime fields into two parts (date and time)

LOAD
    Date(Floor(datetimefield)) AS datefield,
    Time(Frac(datetimefield)) AS timefield,
    anotherfield
FROM yourqvd.qvd (qvd);
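
This saves memory because QlikView stores each distinct field value only once: a raw timestamp is close to unique on every row, while the split date field has at most a few thousand distinct values and the time field at most 86,400 (one per second of the day).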

Clever_Anjos
Employee
Employee

Does your application have synthetic keys? You really should avoid them if you're running into RAM consumption issues - one common way to remove them is sketched below.
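
For example, if two tables share both OrderID and LineNo, QlikView builds a synthetic key behind the scenes; replacing the shared pair with one explicit composite key avoids that (table and field names here are made up):

OrderLines:
LOAD AutoNumber(OrderID & '|' & LineNo) AS OrderLineKey,
     OrderID,
     LineNo,
     Quantity
FROM orderlines.qvd (qvd);

Shipments:
LOAD AutoNumber(OrderID & '|' & LineNo) AS OrderLineKey,
     ShipDate
FROM shipments.qvd (qvd);

AutoNumber() also keeps the key as a compact integer instead of a long string, which saves RAM by itself.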

Not applicable
Author

I've concatenated the four tables into a single table, so I don't have any synthetic keys at all.

I'm starting to think I should hit the boss up for a higher-powered rig.

Clever_Anjos
Employee
Employee

Well, extra RAM modules are not that expensive.

I've just noticed that you're running 32-bit Windows with 4 GB of RAM.

Switching to 64-bit Windows will improve your performance too (64-bit Windows can address all of your memory).

Not applicable
Author

Looking to our IT dept for help now... thanks heaps!