Hi,
I am going to develop a dashboard using Informatica data. At present I don't know the requirements, so before that I need to find some sample dashboards, requirements, and KPIs.
I am using ETL data such as job status, source count, target count, and whether jobs and workflows ran successfully. If anyone has any idea about this, please share the information.
Thanks
Hi,
If you install QlikView you can find a few samples in this location:
C:\Program Files\QlikView\Examples\Documents
Alternatively, you can visit the QlikView demo site to download more demos.
Link to Demo site
Hi Aruna,
Could you please help with the below requirement if you have any idea:
count the IDs whose target date changed.
My table has these fields:
table:
Issue_ID
Date_upload
Target_date
-> KPI chart: when the user selects any two dates, compare them. If the Target_date on the maximum date differs from the Target_date on the previous (maximum) date, count those IDs.
Example:
Issue_ID Date_upload Target_date
01 01-01-2018 10-05-2018
01 03-01-2018 15-05-2018
02 01-01-2018 10-05-2018
02 03-01-2018 10-05-2018
03 01-01-2018 10-05-2018
03 03-01-2018 16-05-2018
In the above table, if we select the dates 01-01-2018 and 03-01-2018 in the Date_upload field, the count is 2, because the target dates of IDs 01 and 03 changed while the target date of ID 02 did not.
So, just count the Issue_IDs: when the user selects any two dates (e.g. 01-01-2018 as the min date and 03-01-2018 as the max date), compare them, and if the Target_date on the min date differs from the Target_date on the max date, include that Issue_ID in the count.
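To make the intended count concrete, here is a minimal Python sketch of the rule described above (this is only an illustration outside QlikView; the function name and data layout are assumptions, not part of the actual dashboard):

```python
from datetime import date

# Sample rows from the table above: (Issue_ID, Date_upload, Target_date)
rows = [
    ("01", date(2018, 1, 1), date(2018, 5, 10)),
    ("01", date(2018, 1, 3), date(2018, 5, 15)),
    ("02", date(2018, 1, 1), date(2018, 5, 10)),
    ("02", date(2018, 1, 3), date(2018, 5, 10)),
    ("03", date(2018, 1, 1), date(2018, 5, 10)),
    ("03", date(2018, 1, 3), date(2018, 5, 16)),
]

def count_changed_ids(rows, d_min, d_max):
    """Count issue IDs whose Target_date differs between the two selected upload dates."""
    target_on = {}  # (issue_id, upload_date) -> target_date
    for issue_id, upload, target in rows:
        target_on[(issue_id, upload)] = target
    changed = 0
    for issue_id in {r[0] for r in rows}:
        t_min = target_on.get((issue_id, d_min))
        t_max = target_on.get((issue_id, d_max))
        if t_min is not None and t_max is not None and t_min != t_max:
            changed += 1
    return changed

print(count_changed_ids(rows, date(2018, 1, 1), date(2018, 1, 3)))  # 2 (IDs 01 and 03)
```

On the sample data this returns 2, matching the expected result in the example.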
I need queries for the above requirement at the back-end level, and please also write the front-end expression if there is a simple way.
I have written queries at the front-end level, but the performance is slow and the chart shows a time-out error.
Thank you.
Can you show what you have tried in the front end that is causing the performance trouble? Then we will know what exactly is happening.
Hi Anil, these are the two expressions in my pie chart:
=Sum({<[Closed date] = {"$(='>=' & Date(Today()-30))"},[Job Status]={Closed}>} [Total No. of Openings])
=Sum({<[Closed date] = {"$(='>=' & Date(Today()-30))"},[Job Status]={Closed}>} [Ageing in days])
Can you explain how to modify them to show skillname-score pairs like the ones in the 1st chart (e.g. Qlikview-60)?
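For readers less familiar with set analysis, the two expressions above effectively filter the rows to Closed jobs whose Closed date falls within the last 30 days and then sum a measure over them. A rough Python sketch of that filtering logic (record layout and names are illustrative assumptions):

```python
from datetime import date, timedelta

# Illustrative records: (closed_date, job_status, total_openings, ageing_days)
records = [
    (date.today() - timedelta(days=5),  "Closed", 3, 12),
    (date.today() - timedelta(days=45), "Closed", 2, 40),  # outside the 30-day window
    (date.today() - timedelta(days=10), "Open",   1, 7),   # wrong status
]

def sum_measure(records, index):
    """Mirror of Sum({<[Closed date]={">=Today()-30"}, [Job Status]={Closed}>} measure)."""
    cutoff = date.today() - timedelta(days=30)
    return sum(r[index] for r in records if r[0] >= cutoff and r[1] == "Closed")

print(sum_measure(records, 2))  # total openings in the window -> 3
print(sum_measure(records, 3))  # ageing in days in the window -> 12
```

Only the first record survives both filters, so the sums come from it alone.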
Hi Staffan,
I tried it, but it's not changing. Is there any way to write it as an expression?
And can you also suggest a solution for the pie chart?
Thank you in advance.
Hi Daisy,
I implemented the code in the back end:
data:
LOAD * INLINE [
Issue_ID,Date_upload,Target_date
01,01-01-2018,10-05-2018
01,03-01-2018,15-05-2018
02,01-01-2018,10-05-2018
02,03-01-2018,10-05-2018
03,01-01-2018,10-05-2018
03,03-01-2018,16-05-2018
];

Left Join(data)
LOAD
    Issue_ID,
    Target_date,
    Count(Issue_ID) as count
Resident data
Group By Issue_ID, Target_date;
In the front end, try the below expression:
=Count({< count={1} >} Aggr(Count({< count={1} >} Issue_ID), Issue_ID))
Please find the attached app; it may be useful to you.
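The back-end approach above works because an ID whose target date changed produces two (Issue_ID, Target_date) groups of count 1, while an unchanged ID produces a single group of count 2. A Python sketch of that same group-by logic (illustrative only, mirroring the inline rows in the script):

```python
from collections import Counter

# Same inline rows as the script above: (Issue_ID, Date_upload, Target_date)
rows = [
    ("01", "01-01-2018", "10-05-2018"),
    ("01", "03-01-2018", "15-05-2018"),
    ("02", "01-01-2018", "10-05-2018"),
    ("02", "03-01-2018", "10-05-2018"),
    ("03", "01-01-2018", "10-05-2018"),
    ("03", "03-01-2018", "16-05-2018"),
]

# Equivalent of: Count(Issue_ID) ... Group By Issue_ID, Target_date
group_counts = Counter((issue_id, target) for issue_id, _, target in rows)

# IDs with any group of count 1 are the "changed" ones; ID 02 only has a
# group of count 2, so it contributes nothing here.
changed_ids = {issue_id for (issue_id, _), c in group_counts.items() if c == 1}
print(len(changed_ids))  # 2
```

This is the same set the front-end Aggr expression isolates via count={1}.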
I haven't shown it anywhere. Does it come from another link? Can you share the reference and tell me where I have to look?
Also, there is nothing complex in that set analysis; it is quite straightforward. I don't think its performance can be improved much further.
Hi Aruna,
Thanks for your reply. I have one more requirement; if you know it, please help me.
I tried the above expressions, but the title is not showing properly.
I am using the below expression:
=If(GetSelectedCount([Issue Location]) = 0, 'Issues by Geography', 'Issues by Region')
Actually, I have two fields: 1) IssueRegion and 2) IssueLocation.
Using both fields I created a drill-down: the initial chart shows data by region, and once the user clicks any region, the chart shows the locations under that region.
By default the chart shows data region-wise with the title "by Region".
My requirement is: when the chart is showing data by location, change the chart title to "by Location".
Thanks.
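One likely issue with the expression above is that it keys off selections rather than off the level the drill-down group is actually displaying; in QlikView the current level is usually tested with GetCurrentField on the drill group. The decision rule itself is simple, as this Python sketch models (the field names are taken from the post; everything else is an illustrative assumption, not a tested QlikView solution):

```python
def chart_title(current_dimension):
    """Pick the chart title from the dimension the drill-down is currently showing."""
    return "Issues by Region" if current_dimension == "IssueRegion" else "Issues by Location"

print(chart_title("IssueRegion"))    # Issues by Region
print(chart_title("IssueLocation"))  # Issues by Location
```

In the chart itself, the equivalent idea would be an If() on the drill group's current field rather than on GetSelectedCount.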