Is there a way to limit the RAM taken by QlikView during a reload, from the script?
I have a script like this:
mytable1:
Load * From mytable1;
Left Join Load * From mytable2;

MinMax:
Load Min(start_date) As MinDate, Max(end_date) As MaxDate Resident mytable1;
Let vMinDate = Num(Peek('MinDate'));
Let vMaxDate = Num(Peek('MaxDate'));
Drop Table MinMax;

Calendar:
Load Date($(vMinDate) + IterNo() - 1) As TempDate
Autogenerate 1 While $(vMinDate) + IterNo() - 1 <= $(vMaxDate);

Left Join IntervalMatch (TempDate) Load Distinct start_date, end_date Resident mytable1;
So if my user makes a mistake and enters a start_date like 01/01/2000 and an end_date like 31/12/2999, the RAM taken by my reload will hit the roof and exceed the maximum RAM available on my server.
So I want to put a limit on my reload: if a load takes too much RAM and exceeds the limit, I want to stop and skip that load.
Hi Xia,
It looks like you are using existing fields from a datamart and you don't have control over the dates. So try to optimize the QlikView application for better performance, which can be done as follows:
In general, QlikView consumes more memory while reloading and while opening the QlikView file. This can be mitigated by saving the file with lower compression.
If a QlikView application is large, it will obviously consume more memory when the application is opened. Splitting a single large QlikView document into several separate documents will help solve the issue.
Server load will be very high if many concurrent users hit the QlikView web server. This can be overcome by using a network load balancer (servers with Windows Advanced Server, IIS and QlikView Web Server).
In data modeling, synthetic keys may be formed if two or more tables have columns in common. Synthetic keys can greatly impact performance, and it is better to avoid or remove them. This can be done by removing unnecessary links and joining the tables explicitly in the script.
In some cases, a QlikView reload can take a long time. The best practice to overcome this is to use a Binary load of static historical data.
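As a minimal sketch (the file and table names here are hypothetical), a Binary load must be the very first statement in the script:

```qlikview
// Binary load must be the first statement in the script.
// 'Historical.qvw' is a hypothetical document holding the static history.
Binary Historical.qvw;

// Then load only the new data and concatenate it onto the historical table.
Concatenate (Facts)
LOAD * FROM NewFacts.qvd (qvd);
```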
Temporary tables are used in QlikView scripting mostly for calculations (Resident load). These temporary tables should be dropped as soon as their purpose is served.
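A small sketch of dropping a temporary table once it has been used (table and field names are illustrative):

```qlikview
Temp:
LOAD CustomerID, Sum(Amount) AS Total
RESIDENT Facts
GROUP BY CustomerID;

Result:
LOAD CustomerID, Total
RESIDENT Temp
WHERE Total > 0;

DROP TABLE Temp;  // free the memory as soon as the table is no longer needed
```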
Complex calculations within a dimension or expression of any QlikView object give poor performance; it is better to move complex calculations into the script of the QVW.
Resource-heavy expressions and calculations greatly hinder the performance of a QlikView application. Replacing them with simpler calculations helps, and it is good scripting practice too.
Example: Count(Distinct Fieldname)
Instead of the expression above, replace the Count() with Sum() and drop the Distinct qualifier by assigning the value 1 to each distinct occurrence as it is read in the script.
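A minimal sketch of that flagging technique (table and field names are hypothetical): Exists() returns true once a value has already been read into the field, so the first occurrence of each Customer is flagged 1 and all repeats 0.

```qlikview
Facts:
LOAD
    Customer,
    Amount,
    // 1 for the first occurrence of each Customer, 0 for repeats
    If(Exists(Customer), 0, 1) AS DistinctCustomerFlag
FROM Facts.qvd (qvd);
```

In the front end, Sum(DistinctCustomerFlag) then replaces Count(Distinct Customer).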
Avoid using complex composite keys (for example when removing synthetic keys); instead use the AutoNumber() function to generate an integer sequence, which uses compact memory.
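A sketch, with hypothetical field names, of replacing a long text composite key with AutoNumber():

```qlikview
Orders:
LOAD
    // compact sequential integer instead of a long text key
    AutoNumber(OrderID & '|' & LineNo) AS OrderLineKey,
    OrderDate,
    Amount
FROM Orders.qvd (qvd);
```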
The memory utilized by QlikView objects and their calculation times can be monitored from the document properties. This helps to identify which object is consuming the most memory and to find the reason for delays in loading the QlikView application.
Minimized chart objects consume less memory than maximized ones, so using the auto-minimize option is good practice here.
(Screenshot: a maximized chart consuming more memory than the minimized chart.)
If a chart is too large, forcing the user to make a selection before it calculates will minimize the chart calculation time.
Showing frequencies in a list box can be avoided if it is not necessary.
When sorting, it is recommended to sort numeric fields numerically rather than alphabetically.
Regards
Neetha
Hello,
you can use Limited Load in the debug window to pull a smaller amount of data instead of all of it.
For example, if Limited Load is set to 10, the result set takes only 10 records from every load (table).
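Outside the debugger, a similar cap can be scripted with the First prefix (the table and file names here are illustrative):

```qlikview
// Load only the first 10 records from the source while developing.
Sample:
First 10
LOAD * FROM BigFacts.qvd (qvd);
```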
Regards
Charitha
Or forcibly limit the scope of your calendar by changing the EndDate into something like:
vMaxDate = RangeMin(Max(End_Date), YearEnd(today(), 1))
In that case, you will also need to limit the individual end_dates in your facts.
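A sketch of that capping during the fact load (field names as in the question; the cap itself is just an example):

```qlikview
// Cap runaway end dates at the end of next year.
Let vCap = Num(YearEnd(Today(), 1));

mytable1:
LOAD
    start_date,
    Date(RangeMin(end_date, $(vCap))) AS end_date
FROM mytable1;
```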
Peter
Hi, thanks for your reply,
but I can't know in advance how many records I will get.
It depends on the requirement whether all the data needs to be pulled or not.
If it is a case where you don't need to pull all the data, then you can go this way.
Otherwise, if all the data must be loaded for exact results, then you need to apply performance optimization techniques to reduce the load time.
-Charitha.