Hi Folks,
We are facing a serious issue with server memory utilization: it sometimes exceeds 100%, which crashes the server machine.
This happens even though we have restricted memory utilization in the QMC to a maximum of 80% of total machine memory. We finally found the reason in the QlikView Server audit log: one chart in our application is causing the issue. When a user tries to open this chart, it does not respond for 15-20 minutes, and during that time we can see in Task Manager that memory climbs from 20-30% up to 100% and crashes the server.
We are using the expression below in the chart:
Sum ({<Complaint_Acceptance-={4}>}QTY_PPM*Top_Pos_Flag)/
if(Complaint_Type=3 or Complaint_Type=1, aggr(nodistinct Sum (QTY_Delivered_TOTAL), Month),
if(Complaint_Type=2 or Complaint_Type=7, aggr(nodistinct Sum (QTY_Received_TOTAL),Month),0))
*1000000
Please suggest what we can do with this expression so that the chart responds as quickly as the others.
I tried the expression below to get rid of the Aggr function, but the output does not match:
Sum ({<Complaint_Acceptance-={4}>}QTY_PPM*Top_Pos_Flag)/
sum(
if(match(Complaint_Type, 1, 3), QTY_Delivered_TOTAL,
if(match(Complaint_Type, 2, 7), QTY_Received_TOTAL,
0))
)
*1000000
Thanks,
AS
Hi Amit,
Try recreating the chart as a new object and check; sometimes that works.
Regards,
Jagan.
Without trying this with your data, I would assume the if-statement is your problem.
What if you create a new field in your script, called QTY_Total, and use that in your chart:
if(Complaint_Type=3 or Complaint_Type=1, QTY_Delivered_TOTAL, if(Complaint_Type=2 or Complaint_Type=7, QTY_Received_TOTAL, 0)) AS QTY_Total
Then you should be able to remove the if's, writing your expression like:
(Sum ({<Complaint_Acceptance-={4}>}QTY_PPM*Top_Pos_Flag)/aggr(nodistinct Sum (QTY_Total), Month))*1000000
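In the script, that could look roughly like this (a sketch only; the table name Facts, the QVD file name, and the field list are placeholders for your actual load):

```qlikview
Facts:
LOAD
    Complaint_Type,
    Complaint_Acceptance,
    Month,
    QTY_PPM,
    Top_Pos_Flag,
    QTY_Delivered_TOTAL,
    QTY_Received_TOTAL,
    // Precompute the conditional quantity once per row at reload time,
    // so the chart expression no longer needs a row-level if()
    if(Complaint_Type=3 or Complaint_Type=1, QTY_Delivered_TOTAL,
    if(Complaint_Type=2 or Complaint_Type=7, QTY_Received_TOTAL, 0)) AS QTY_Total
FROM [SourceData.qvd] (qvd);
```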
Does that make a difference?
Maybe like this:
Sum ({<Complaint_Acceptance-={4}>}QTY_PPM*Top_Pos_Flag)/
alt(pick(Complaint_Type,
sum(total <Month> QTY_Delivered_TOTAL),
sum(total <Month> QTY_Received_TOTAL),
sum(total <Month> QTY_Delivered_TOTAL),
0,
0,
0,
sum(total <Month> QTY_Received_TOTAL)
),0)
*1000000
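A more compact variant of the same idea, using match() to map the type codes to positions instead of listing all seven slots (same assumption that types 1 and 3 use the delivered quantity and types 2 and 7 the received quantity; for any other type, match() returns 0, pick() returns null, and alt() falls back to 0):

```qlikview
Sum ({<Complaint_Acceptance-={4}>} QTY_PPM*Top_Pos_Flag) /
alt(pick(match(Complaint_Type, 1, 3, 2, 7),
    sum(total <Month> QTY_Delivered_TOTAL),
    sum(total <Month> QTY_Delivered_TOTAL),
    sum(total <Month> QTY_Received_TOTAL),
    sum(total <Month> QTY_Received_TOTAL)
), 0)
* 1000000
```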
Are you sure you want to divide by zero for Complaint_Type values other than those listed?
It's hard to help you without knowing a little more about the context of your expression (in which chart you are using it, with what dimensions etc.) and your data model.
In general,
- use aggregation functions rather than row-level if() logic where possible
- take care that your expression does not force joins between large tables, especially tables with no key fields in common
Yes Swuehl,
We are doing this based on a business requirement.
Unfortunately, this application's data model is very complex. Let me follow your suggestions; I hope this will help me solve the problem.
Thanks,
AS
Hi Gysbert,
Please see below.
The only remaining issue with this graph is that the expression below is causing the problem:
Sum ({< Complaint_Acceptance-={4},Month=,Year=, Day=, Year_Month=, NumDate = {">=$(vRollingDate)<=$(vRollingDateEnd)"} >} QTY_PPM*Top_Pos_Flag) /
Sum ({<Complaint_Type={3,1},Month=,Year=, Day=, Year_Month=, NumDate = {">=$(vRollingDate)<=$(vRollingDateEnd)"} >} QTY_Delivered_TOTAL)*1000000
where
vRollingDate=num(MonthStart(AddMonths(max(NumDate),-11)))
vRollingDateEnd==floor(num(MonthEnd(max(NumDate) ) ))
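Assuming these variables are defined in the load script, the leading = in the value (which is where the double == above comes from) makes QlikView re-evaluate the expression against the current selection each time it is used; a sketch:

```qlikview
// SET stores the text after the first = verbatim; the embedded leading =
// turns the stored text into an expression evaluated at chart time
SET vRollingDate    = =num(MonthStart(AddMonths(max(NumDate), -11)));
SET vRollingDateEnd = =floor(num(MonthEnd(max(NumDate))));
```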
Thanks,
AS
Hi Folks,
Please help me with the expression below; it is consuming more than 100% of the machine's memory.
Sum ({< Complaint_Acceptance-={4},Month=,Year=, Day=, Year_Month=, NumDate = {">=$(vRollingDate)<=$(vRollingDateEnd)"} >} QTY_PPM*Top_Pos_Flag) /
Sum ({<Complaint_Type={3,1},Month=,Year=, Day=, Year_Month=, NumDate = {">=$(vRollingDate)<=$(vRollingDateEnd)"} >} QTY_Delivered_TOTAL)*1000000
where
vRollingDate=num(MonthStart(AddMonths(max(NumDate),-11)))
vRollingDateEnd==floor(num(MonthEnd(max(NumDate) ) ))
Is there any better way to write this expression?
Thanks
AS
I don't see a way to optimize that expression other than leaving out anything that isn't strictly needed. I suspect your problem is the data model of your document, and possibly the amount of data in your document compared to the amount of RAM your server has.
Yes, you are right, I have a very complex data model.
It looks like I will have to work on that.
Thanks,
AS