Not applicable

Loading issues

Regarding loading: because of limited RAM we get errors during the reload. How can we resolve these types of issues?

Currently we are putting a condition on the load and concatenating the QVDs after loading.

8 Replies
morgankejerhag
Partner - Creator III

It depends on the circumstances. If you have too much data, a filter that selects less data could work. If you do some RAM-heavy calculation in the script, you can try to rewrite it or calculate on smaller portions of the data. What is the script doing when it crashes?
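For example, a WHERE clause on the load keeps only the rows you actually need in RAM (the table and field names below are just placeholders):

Sales:
LOAD
    OrderID,
    OrderDate,
    Amount
FROM Sales.qvd (qvd)
WHERE OrderDate >= MakeDate(2015);   // keep only recent rows to reduce RAM usage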

Not applicable
Author

Thank you. Is there any alternative that does not involve putting a WHERE condition, limiting the size, etc.?

morgankejerhag
Partner - Creator III

You can aggregate the data to some level.
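For instance (table and field names are only illustrative), a GROUP BY load stores daily totals instead of individual transactions:

DailySales:
LOAD
    OrderDate,
    Sum(Amount)    as DailyAmount,
    Count(OrderID) as OrderCount
FROM Transactions.qvd (qvd)
GROUP BY OrderDate;   // one row per day instead of one row per transaction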

ramasaisaksoft

Hi Venu,

       1) Remove unnecessary columns from your source file load.

       2) If there are any calculations in the load script, remove them.

       3) If the data size is high (megabytes to gigabytes), it is better to fetch data from .qvd files with an incremental load (see the sketch after this list).

       4) It is also better to fetch data with a binary load of an existing .qvw file.
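A rough sketch of an incremental load, assuming a LastModified field in the source, an existing CONNECT statement, and a vLastReloadTime variable (all names here are assumptions, not a fixed recipe):

// 1) Pull only new or changed rows from the source database
Orders:
SQL SELECT OrderID, OrderDate, Amount, LastModified
FROM dbo.Orders
WHERE LastModified >= '$(vLastReloadTime)';

// 2) Append the unchanged history already stored in the QVD
Concatenate (Orders)
LOAD OrderID, OrderDate, Amount, LastModified
FROM Orders.qvd (qvd)
WHERE NOT Exists(OrderID);

// 3) Store the combined table back so the next reload starts from it
STORE Orders INTO Orders.qvd (qvd);

A binary load, by contrast, is a single statement (e.g. Binary SourceApp.qvw;) that must be the very first line of the script.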

vardhancse
Specialist III

When loading data from any table, loading only the required data will reduce memory usage during the reload.

For example, use LEFT KEEP/JOIN so that only the fact table holds the complete data set, and the dimension tables contain only the rows related to the fact table.
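A minimal sketch of that pattern (the QVD and field names are placeholders): LEFT KEEP keeps the tables separate but drops dimension rows that have no match in the fact table.

Fact:
LOAD CustomerID, OrderDate, Amount
FROM Fact.qvd (qvd);

// keep only the customers that actually appear in the fact table
Customers:
LEFT KEEP (Fact)
LOAD CustomerID, CustomerName, Region
FROM Customers.qvd (qvd);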

Removing unused fields will also improve performance.

Large fields such as a free-text Subject that are not required in reporting can be dropped, which helps as well.

Using the AutoNumber/AutoNumberHash functions reduces the storage needed for primary keys.
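For example (field names are illustrative), AutoNumber() replaces a long composite text key with a compact sequential integer:

OrderLines:
LOAD
    AutoNumber(OrderID & '|' & LineNo) as %OrderLineKey,   // small integer surrogate instead of a long text key
    OrderDate,
    Amount
FROM OrderLines.qvd (qvd);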

Not applicable
Author

Hello Venu,

1) Remove unnecessary columns from your source file load, or use ApplyMap() to bring in only the required fields (see the sketch after this list).

2) Check whether historical data is actually required.

3) Instead of loading the whole data set, apply a partial reload.

4) Try to avoid Resident loads; instead, read the data from the QVD twice.
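A small ApplyMap() sketch for point 1 (table and field names are assumptions): instead of joining a whole lookup table, map just the one field you need.

// mapping table: exactly two columns, key and lookup value
CustomerMap:
MAPPING LOAD
    CustomerID,
    CustomerName
FROM Customers.qvd (qvd);

Orders:
LOAD
    OrderID,
    CustomerID,
    ApplyMap('CustomerMap', CustomerID, 'Unknown') as CustomerName,   // only this one field is pulled in
    Amount
FROM Orders.qvd (qvd);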

JonnyPoole
Employee

Another small one: if you have timestamps (date fields that store both date and time information), it is helpful to aggregate them down to just a date:

LOAD
     Date(Floor(Timestamp))
FROM ....;


Floor() strips the time portion, so you deal with whole days only.


This removes uniqueness and lets QlikView compress the field much better.


You can also do similar tricks with phone numbers.


Storing the full 10-digit number in one field makes it highly unique, but you get more repetition if you break it into 2-3 component fields.


Store a telephone field like this:

222-111-5555

as:

Phone1     Phone2     Phone3
222        111        5555
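In the load script that split could be done with SubField() (the Phone field and table name are just examples):

Contacts:
LOAD
    ContactID,
    SubField(Phone, '-', 1) as Phone1,   // 222
    SubField(Phone, '-', 2) as Phone2,   // 111
    SubField(Phone, '-', 3) as Phone3    // 5555
FROM Contacts.qvd (qvd);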

Not applicable
Author

Venu,

If you are loading multiple tables, i.e. a fact table and several dimensions, make sure you load the fact first and then the dimensions with a WHERE Exists() to limit each dimension load to only the keys present in the fact.
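A brief sketch of that load order, with placeholder names: the fact load fills the ProductID values, and Exists() then filters the dimension load (with a single field argument, Exists() usually also keeps the QVD load optimized).

Fact:
LOAD ProductID, OrderDate, Amount
FROM Fact.qvd (qvd);

// dimension loaded second: keep only ProductIDs already present in the fact
Products:
LOAD ProductID, ProductName, Category
FROM Products.qvd (qvd)
WHERE Exists(ProductID);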

Also beware of "snowflake" tables as they can hit memory hard sometimes.

Shawn