Hi All,
Sometimes users may want to see charts from a certain point in time, such as YTD, QTD, MTD, or the last 5 years. Below are set analysis expressions for these scenarios.
YTD Sales (Year To Date)
Sum({<Year=, Quarter=, Month=, Week=, Date={'>=$(=YearStart(Today()))<=$(=Today())'}>} Sales)
Note: Year=, Quarter=, Month=, Week= ignores (clears) any selections in the Year, Quarter, Month and Week fields.
QTD Sales (Quarter To Date)
Sum({<Year=, Quarter=, Month=, Week=, Date={'>=$(=QuarterStart(Today()))<=$(=Today())'}>} Sales)
MTD Sales (Month To Date)
Sum({<Year=, Quarter=, Month=, Week=, Date={'>=$(=MonthStart(Today()))<=$(=Today())'}>} Sales)
WTD Sales (Week To Date)
Sum({<Year=, Quarter=, Month=, Week=, Date={'>=$(=WeekStart(Today()))<=$(=Today())'}>} Sales)
Last 5 Years Sales
Sum({<Year=, Quarter=, Month=, Week=, Date={'>=$(=YearStart(Today(), -4))<=$(=Today())'}>} Sales)
Last 6 Quarters Sales
Sum({<Year=, Quarter=, Month=, Week=, Date={'>=$(=QuarterStart(Today(), -5))<=$(=Today())'}>} Sales)
Last 12 Months Sales
Sum({<Year=, Quarter=, Month=, Week=, Date={'>=$(=MonthStart(Today(), -11))<=$(=Today())'}>} Sales)
Last 15 Weeks Sales
Sum({<Year=, Quarter=, Month=, Week=, Date={'>=$(=WeekStart(Today(), -14))<=$(=Today())'}>} Sales)
Last 10 Days Sales
Sum({<Year=, Quarter=, Month=, Week=, Date={'>=$(=Date(Today()-9))<=$(=Today())'}>} Sales)
Yesterday Sales
Sum({<Year=, Quarter=, Month=, Week=, Date={'$(=Date(Today()-1))'}>} Sales)
You can also create flags for the above scenarios in the script and use those flags in your set analysis expressions, provided your data is always based on the current date. Refer to the link below created by Richard.Pearce60
Calendar with flags making set analysis so very simple
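As a rough illustration of the flag approach, a master calendar can precompute period flags at load time. This is a minimal sketch only; the table name Facts and the field names Date, YTD_Flag and MTD_Flag are assumptions, not taken from the linked post:

```
// Sketch: master calendar with precomputed period flags (names are illustrative)
Calendar:
LOAD
    Date,
    If(Date >= YearStart(Today()) and Date <= Today(), 1, 0)  as YTD_Flag,
    If(Date >= MonthStart(Today()) and Date <= Today(), 1, 0) as MTD_Flag;
// Generate one row per day between the min and max fact dates
LOAD
    Date(MinDate + IterNo() - 1) as Date
While MinDate + IterNo() - 1 <= MaxDate;
LOAD
    Min(Date) as MinDate,
    Max(Date) as MaxDate
Resident Facts;
```

A chart expression then reduces to Sum({<YTD_Flag = {1}>} Sales), with no date arithmetic evaluated at chart time.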
Hope this helps.
Regards,
Jagan.
Calculating working hours is an essential measure in many business scenarios. In this post, I will demonstrate the steps to calculate this measure. Additionally, I will cover other aspects related to working hours, such as calculating overtime and excluding lunch breaks.
Note:
The example in this post assumes weekends are Saturday and Sunday. If your weekends fall on different days, please refer to the post linked below.
Working-days-and-hours-calculations-for-custom-weekends
Consider the following case:
Suppose a ticket is logged into the system on a certain day, referred to as the Start Date, and the same ticket is resolved on a different day, referred to as the End Date. We may want to calculate the working hours between these two dates to assess the efficiency of ticket closure time.
Here is how you can calculate it within the script.
1) Calculate business working hours, excluding weekends (Saturday and Sunday) and holidays.
The considerations are as follows:
1) Count only standard working hours (9 AM - 6 PM) - you can change this accordingly
2) Exclude Saturdays and Sundays
3) Exclude holidays
You can adjust the date format variables below to match the actual format of your data, then use the Timestamp() function to represent it in the required format.
SET TimeFormat='hh:mm:ss';
SET DateFormat='DD/MM/YYYY';
SET TimestampFormat='DD/MM/YYYY hh:mm:ss';
Set up the variables below for standard working hours on weekdays. You can adjust the variables according to your working hours (e.g., 9 AM - 6 PM), and the rest of the calculations will be done automatically.
// Set the start and end hour of the day in 24 hour format
LET vStartHour = 9;
LET vEndHour = 18;
LET vWorkingHourPerDay = $(vEndHour) -$(vStartHour);
Set up the holiday list as shown below. Feel free to use your own holiday list.
Holidays:
LOAD Concat(chr(39)&Holidays&chr(39),',') as Holidays Inline [
Holidays
08/03/2016
09/03/2016
17/08/2010
];
LET vHolidays = Peek('Holidays',0,'Holidays');
The following is the logic to calculate the business working hours between two dates:
Data:
LOAD *,
rangesum(
NetWorkDays(START_TIME+1,END_TIME-1,$(vHolidays)) * MakeTime($(vWorkingHourPerDay)), // In between hours
if(NetWorkDays(END_TIME,END_TIME,$(vHolidays)),
Rangemin(rangemax(frac(END_TIME),maketime($(vStartHour))),maketime($(vEndHour)))-
Rangemax(rangemin(frac(END_TIME),maketime($(vStartHour))),maketime($(vStartHour))),0), // working hours last day
if(NetWorkDays(START_TIME,START_TIME,$(vHolidays)),
Rangemin(rangemax(frac(START_TIME),maketime($(vEndHour))),maketime($(vEndHour)))-
Rangemax(rangemin(frac(START_TIME),maketime($(vEndHour))),maketime($(vStartHour))),0), // working hours first day
if(NetWorkDays(START_TIME,START_TIME,$(vHolidays)) and floor(START_TIME)=floor(END_TIME),-MakeTime($(vWorkingHourPerDay))) // If same day then correct the hours
)*24 AS Business_Hrs_Without_Overtime,
rangesum(if(NetWorkDays(START_TIME,START_TIME,$(vHolidays)) ,
rangesum(if(frac(START_TIME)<maketime($(vStartHour)),maketime($(vStartHour))-frac(START_TIME),0),
if(frac(END_TIME)>maketime($(vEndHour)),frac(END_TIME)-maketime($(vEndHour)),0))*24)) as Overtime ; // Overtime
LOAD *,
timestamp(StartTime,'DD/MM/YYYY hh:mm:ss TT') as START_TIME,
timestamp(EndTime,'DD/MM/YYYY hh:mm:ss TT') as END_TIME
Inline [
TicketNo,StartTime, EndTime
1, 8/25/2010 3:00:00 PM, 8/27/2010 6:00:00 PM
2, 8/16/2010 10:00:00 AM, 8/17/2010 1:00:00 PM
3, 8/17/2010 1:30:00 PM, 8/17/2010 2:45:00 PM
4, 8/17/2010 3:00:00 PM, 8/18/2010 5:00:00 PM
5, 8/18/2010 5:01:00 PM, 8/19/2010 4:00:00 PM
6, 8/19/2010 5:00:00 PM, 8/20/2010 10:00:00 AM
7, 8/20/2010 11:00:00 AM, 8/20/2010 5:00:00 PM
8, 8/23/2010 2:00:00 PM, 8/23/2010 4:00:00 PM
9, 8/23/2010 5:00:00 PM, 8/23/2010 6:00:00 PM
10, 8/24/2010 7:00:00 AM, 8/24/2010 2:00:00 PM
11, 8/20/2010 5:30:00 PM,8/23/2010 1:00:00 PM
12, 3/7/2016 4:00:00 PM, 3/10/2016 6:00:00 PM
13, 8/19/2010 11:00:00 AM, 8/20/2010 6:30:00 PM];
DROP Fields StartTime, EndTime;
You can then create measures to display working hours. Use the measure below if you want to present the working hours in hh:mm:ss format.
=interval(sum(Business_Hrs_Without_Overtime)/24,'hh:mm:ss')
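As a sanity check on the logic above, ticket 1 (start 25/08/2010 3:00 PM, end 27/08/2010 6:00 PM, no holidays in range) works out as follows with the 9 AM - 6 PM window:

```
First day (Wed 25/08):   18:00 - 15:00        = 3 hours
In-between (Thu 26/08):  1 workday x 9 hours  = 9 hours
Last day (Fri 27/08):    18:00 - 09:00        = 9 hours
Business_Hrs_Without_Overtime                 = 21 hours
```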
2) Calculate business working hours, excluding weekends (Saturday and Sunday), lunch breaks and holidays.
The considerations are as follows:
1) Count only standard working hours (9 AM - 6 PM)
2) Exclude Saturdays and Sundays
3) Exclude the lunch break (1 PM - 2 PM)
4) Exclude holidays
Set the variables for standard working hours and lunch breaks. You can change the values according to your needs.
// Set the start and end hour of the day in 24 hour format
LET vStartHour = 9;
LET vEndHour = 18;
LET vLunchStart =13;
LET vLunchEnd =14;
LET vWorkingHourPerDay = ($(vEndHour) -$(vStartHour))-($(vLunchEnd)-$(vLunchStart));
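With the example values above, the daily working time evaluates to:

```
vWorkingHourPerDay = (18 - 9) - (14 - 13)
                   = 9 - 1
                   = 8 hours
```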
Include the Holidays
// Include the holidays list
Holidays:
LOAD Concat(chr(39)&Holidays&chr(39),',') as Holidays Inline [
Holidays
08/03/2016
09/03/2016
];
LET vHolidays = Peek('Holidays',0,'Holidays');
The following is the logic to calculate the business working hours between two dates:
Data:
LOAD *,
rangesum(
NetWorkDays(START_TIME+1,END_TIME-1,$(vHolidays)) * MakeTime($(vWorkingHourPerDay)), // $(vWorkingHourPerDay) hours per workday, for all days in between the period, excluding the boundary days
if(NetWorkDays(END_TIME,END_TIME,$(vHolidays)) ,
rangesum(rangemin(frac(END_TIME),MakeTime($(vLunchStart)))- rangemin(frac(END_TIME),MakeTime($(vStartHour))) ,
rangemin(frac(END_TIME),MakeTime($(vEndHour))) - rangemin(frac(END_TIME),MakeTime($(vLunchEnd)))),0), // working hours last day
if(NetWorkDays(START_TIME,START_TIME,$(vHolidays)),
rangesum(MakeTime($(vLunchStart)) - rangemin(rangemax(frac(START_TIME), MakeTime($(vStartHour))),MakeTime($(vLunchStart))),
MakeTime($(vEndHour)) - rangemax(rangemin(frac(START_TIME), MakeTime($(vEndHour))),MakeTime($(vLunchEnd)))),0), // working first day
if(NetWorkDays(START_TIME,START_TIME,$(vHolidays)) and floor(START_TIME)=floor(END_TIME),-MakeTime($(vWorkingHourPerDay))) // If the same day then correct the hours
)*24 AS Business_Hrs_Without_Overtime,
rangesum(if(NetWorkDays(START_TIME,START_TIME,$(vHolidays)) ,
rangesum(if(frac(START_TIME)<maketime($(vStartHour)),maketime($(vStartHour))-frac(START_TIME),0),
if(frac(END_TIME)>maketime($(vEndHour)),frac(END_TIME)-maketime($(vEndHour)),0))*24)) as Overtime ; // Overtime
LOAD *,
timestamp(StartTime,'MM/DD/YYYY hh:mm:ss TT') as START_TIME,
timestamp(EndTime,'MM/DD/YYYY hh:mm:ss TT') as END_TIME;
LOAD * Inline [
TicketNo,StartTime, EndTime
1, 8/16/2010 7:00:00 AM, 8/16/2010 7:00:00 PM
2, 8/16/2010 10:00:00 AM, 8/16/2010 1:30:00 PM
3, 8/16/2010 9:00:00 AM, 8/16/2010 2:00:00 PM
4, 8/16/2010 11:00:00 AM, 8/16/2010 1:00:00 PM
5, 8/16/2010 1:15:00 PM, 8/16/2010 1:45:00 PM
6, 8/16/2010 3:00:00 PM, 8/16/2010 7:00:00 PM
7, 8/16/2010 1:30:00 PM, 8/16/2010 6:00:00 PM
8, 8/16/2010 2:00:00 PM, 8/16/2010 7:00:00 PM
9, 8/16/2010 5:00:00 PM, 8/16/2010 6:00:00 PM
10, 8/16/2010 7:00:00 AM, 8/16/2010 1:00:00 PM
11, 8/16/2010 9:30:00 AM,8/16/2010 11:00:00 AM
12, 8/16/2010 1:00:00 PM, 8/16/2010 1:34:00 PM
13, 8/16/2010 2:00:00 PM, 8/17/2010 7:00:00 PM
14, 8/16/2010 1:00:00 PM, 8/17/2010 6:00:00 PM
15, 8/16/2010 9:00:00 AM, 8/17/2010 1:00:00 PM
16, 8/16/2010 3:30:00 PM,8/17/2010 2:00:00 PM
17, 8/16/2010 7:00:00 AM, 8/17/2010 5:00:00 PM
18, 8/17/2010 10:00:00 AM, 8/19/2010 5:00:00 PM
19, 8/17/2010 3:00:00 PM, 8/19/2010 4:00:00 PM
20, 8/19/2010 1:00:00 PM, 8/24/2010 11:00:00 AM
];
Please refer to the attached applications.
Please feel free to share any suggestions.
Overview
For those of us who work with the QlikView distribution service, either the standard version or Publisher, this model may be of some use in analysing your tasks.
As a Qlik partner, I developed this to read in the various .xml files created by the distribution service and to help look for some common problems.
There are lots of clever ways of analysing the QVPR data, including through the Governance Dashboard; however, I went with this approach as it gave me something tangible and reasonably simple to interpret and then present to a client.
Instructions
Considerations
I have used this in a mix of QlikView 11 and 12 environments, using both Publisher and the standard distribution service. I can't account for all environments, so the model is provided as-is; please feel free to expand it for your own use or provide feedback that may benefit the community.
Thanks for taking the time to read this and I hope you find it useful.
Generic Keys is a way to define keys between tables in a more general way so that their values can represent other things than individual key values; they can represent groups of key values or any key value. As an example, you can combine product IDs, product group IDs and a symbol for all products into one key field.
You can use generic keys to solve many data modeling problems:
See more in the attached files.
PS I have been asked to make an example on comparing budget and actual numbers, so here it comes as a second attachment. It is a zip file with a mock-up order database with a budget. Create a new empty qvw; expand the zipped files in the same folder as the qvw; insert one of the two included script files into the qvw and run the script.
Here is an easy approach to extracting the data model from a QlikView (QVW) file. The example documented uses QlikView Desktop, if you are using Qlik Sense then please click here.
Why would you use or need this kind of technique?
In a production environment you may find use of the output table script elements without the variables as they can be useful in load processes when transformed QVDs can be re-used by other applications. However, there are many times where you need to create something quickly for an ad-hoc piece of work or as a demonstration to show a customer with amended data to make it more relevant (changing product, department names etc). This approach can also be useful when working offline on the front end or application tuning where you do not want to deal with all of the complications of the ETL process.
Step 1 - Enter the file name and path for the QVW from which you wish to extract data
Step 2 - Enter the destination folder path where the extract data will be saved
Step 3 - Select the output format type
Step 4 - Save and reload the application
Please note these instructions are also included in the dashboard itself in case you forget where it came from. This is what the dashboard looks like.
It's as easy as that.
The QlikView 12-compatible version of the QlikView System Monitor, a derivative of the version 10/11 monitor, has finally arrived. The UI has undergone some changes and the structure/setup has been simplified. This application reads your QVS machine logs and outputs all kinds of information for you, such as virtual memory warnings, PGO/.Shared errors, user trending and utilization, and chronological logging events across the system.
If you're looking for the QV 11.2 version, please find it here.
Thanks for your patience and if you have any questions/comments please post to this thread!
Best,
MT !
Hi,
The code below replaces each character in a string with its ASCII code in the load script.
For example:
ABC is converted to 656667, since the ASCII code of A is 65, B is 66 and C is 67.
CharMap:
Mapping LOAD
Chr(RecNo() - 1) AS Char,
RecNo() - 1 AS Asciicode
AutoGenerate 256;
Data:
LOAD
Text,
MapSubString('CharMap', Text) as formattedText
FROM DataSource;
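To try this without an external source, the same load can be pointed at a small inline table. This is an illustrative stand-in only; it assumes the CharMap mapping above has already been loaded:

```
// Illustrative stand-in for DataSource; requires the CharMap mapping above
Data:
LOAD
    Text,
    MapSubString('CharMap', Text) as formattedText
Inline [
Text
ABC
];
// 'ABC' becomes 656667 (A=65, B=66, C=67)
```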
Hope this helps.
Regards,
Jagan.
Dimensions and calculations over periods are an essential part of nearly every report. Period-based analysis is largely independent of the data area being analysed, whether sales, finance, human resources or production data. Few things in an analysis are as exciting as the development of data over time, and the questions that follow: was this development expected or not, and what could be the reasons?
However, handling time data can be difficult, although most of the difficulties can be avoided with a few simple rules.
The easiest way is often to use a master calendar as a dimension table linked to the fact table(s). For the why and how, see:
The Fastest Dynamic Calendar Script (Ever)
Master Calendar with movable holidays
In more complex data models it is often necessary to create several calendars and/or to use calendars that diverge from the normal year calendar.
Why You sometimes should Load a Master Table several times
Fiscal Calendar with Non-Standard Days (Not 1-31)
It is important to define and format the time dimensions properly. Properly means that the dimensions are (also) numeric, since only numeric values can be calculated with, or compared against, each other.
The background is that the date 12/31/1899 is equal to 1 and each further day adds one, so the date 12/31/1999 corresponds to 36525. Hours, minutes and seconds are fractions of 1; for example 1 / 24 / 60 = 0.000694 is equal to 1 minute.
This means that all fields used in calculations (comparing is a calculation, too) should (additionally) be available as a numeric field or as a dual field:
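A sketch of what that can look like in the script (the table name Orders and the field names are illustrative, not from any of the linked posts):

```
// Keep time dimensions numeric so they can be calculated and compared
LOAD
    Date(Floor(OrderTimestamp))         as OrderDate,  // integer date serial with a date format
    Year(OrderTimestamp)                as Year,
    Dual('Q' & Ceil(Month(OrderTimestamp)/3),
         Ceil(Month(OrderTimestamp)/3)) as Quarter,    // displays 'Q1'..'Q4', calculates as 1..4
    Frac(OrderTimestamp)                as TimeOfDay   // fraction of a day, e.g. 0.5 = 12:00
Resident Orders;
```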
In addition, relative and/or continuous time dimensions and flags are often very helpful to avoid complex calculations:
Creating Reference Dates for Intervals
Calendar with flags making set analysis so very simple
Period Presets: Compare Periods on the fly
Subroutine to Create Data Model for From/To Date Selection
Calendar with AsOf Flags, Compare Easter to Easter
Besides simple but frequent time comparisons, with one or several time dimensions in one object and simple expressions like Sum(Value) or Count(Order), there are more complicated questions like:
Previous YTQ, QTD, MTD and WTD
Calculating rolling n-period totals, averages or other aggregations
Besides the links used above, you will find many interesting postings on these topics here within the Qlik Community - the notes here are a good starting point to go further.
Have fun!
PS: The attachment contains a German translation (deutsche Fassung).
Hi All,
This document helps you load multiple Excel files and their sheets, capturing both the sheet name and the data.
//to read each file from the specified folder
FOR EACH file in FileList('filepath\*.xlsx');
//Use the SQLtables command over an ODBC connection to get the sheet information for the file
ODBC CONNECT32 TO [Excel Files;DBQ=$(file)];
tables:
SQLtables;
DISCONNECT;
FOR i = 0 to NoOfRows('tables')-1
LET sheetName = purgeChar(purgeChar(peek('TABLE_NAME', i, 'tables'), chr(39)), chr(36));
Table:
Load * ,
FileBaseName() as File,
FileDir() as Dir,
FileName() as File_Name,
'$(sheetName)' as Sheet_name
From $(file)(ooxml, embedded labels, table is [$(sheetName)]);
NEXT i
DROP Table tables; // clear the sheet list so it doesn't accumulate across files
Next
Hope this helps!
Please find attached the example QVW and test files.
Regards,
Attached is a QVD file with 249 countries tagged as officially accepted by ISO.
The country names and codes are compatible with for example Google's GeoChart extensions.
Fields included in the QVD are...
- Two letter country code, ISO 3166-1 Alpha2 code
- Three letter country code, ISO 3166-1 Alpha3 code
- Official country name
- Shorter version of country name, more suited for presentation
- Most common regions in the world
Primary ISO code source is ISO.org; ISO 3166-1 decoding table - ISO 3166 Maintenance agency - ISO
Attached is an example of stock aging in SAP. In this example I have used the MSEG (Document Segment) and MKPF (Material Document Header) tables to identify both the current stock position and the purchases (the In movements of stock). The final stock value can then be distributed across the most recent stock in movement records to assign a date as to when this stock was received (based on a FIFO - first in first out methodology). The MKPF table is required as there was an incomplete set of records in the BUDAT (stock-in date) field in the MSEG table in this implementation (this may be different at other sites).
The basic process is as follows:
As this involves a bit of logic to look at preceding values etc., I have attached an example that shows the 4 steps in the coding, which you can copy and reuse. The example is based on the QVDs as they would be generated using the SAP Connector script generator**, so if you have used this it should be an easy reload of this data.
**Note: the MATNR field is renamed to make it a key in MSEG as [%MATNR_KEY].
Any clarifications or improvements please feel free to add in comments below.
Hello Everyone,
I have attached a document covering the important QlikView functions used in the script as well as in the UI.
Please have a look, and feel free to update the document or comment in this thread about any functions that are missing.
As QlikView applications grow the number of tabs that information is spread across can grow rapidly as well. The new Ajax view makes it easier to navigate when there are many tabs (with the drop down) but if users are using the IE Plugin or an older version of QlikView a lot of the screen can be taken up with tabs.
One way of solving this is to group the tabs into functional areas and place a menu on the welcome tab that allows the user to select which functional area they want to look at.
Furthermore, if some tabs are simply not relevant to some users then it is possible to hide those tabs (and the menu options to show them) from those users.
This document gives an example of a menu that switches tabs on and off and implements hiding of tabs from users based on their OSUser name - loaded from an Inline table in the load script.
I hope you find this document useful, you can find links to other documents I have uploaded here: http://www.quickintelligence.co.uk/qlikview-examples/
Steve
One of our Qlik Community buddies (Rupinder Singh) shared an awesome way to optimize the performance of the Group By statement, but I don't think it got the limelight it deserved. To give it better exposure, I am creating this document from this thread: Optimize Group By
For the examples below I am using QlikView 11.2 SR4.
I have always had trouble aggregating my data, as Group By statements in scripts have always taken too long for my comfort.
For example, grouping 23 million records (2 columns - ID and Sales) takes me 10 mins.
Table_Group:
Load ID,Sum(Sales)
Resident Table
Group By ID;
Instead, if I order my data by ID first and then run the same Group By clause, I get results back in less than 2 mins.
Table_Sort:
NoConcatenate Load ID,Sales
Resident Table
Order by ID;
Table_Group:
NoConcatenate Load ID,Sum(Sales)
Resident Table_Sort
Group By ID;
I would have expected QV to perform this step internally, but implementing it explicitly has improved my query times by 500%.
I have tables with 40 columns (36 of which to be grouped by and 4 to be summed) and the query time has gone from 18 to 6 mins.
I have just ensured that my Order By column list is the same as my Group By column list.
So, if the Order By is "Order By Column1,Column2,Column3" then the Group By is "Group By Column1,Column2,Column3".
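Generalised, the pattern looks like this (the table and field names are illustrative):

```
// Sketch of the sort-then-aggregate pattern with several grouping columns
Table_Sort:
NoConcatenate
Load Column1, Column2, Column3, Sales
Resident Table
Order by Column1, Column2, Column3;

Table_Group:
NoConcatenate
Load Column1, Column2, Column3,
     Sum(Sales) as TotalSales
Resident Table_Sort
Group by Column1, Column2, Column3;

Drop Table Table_Sort; // the pre-sorted copy is no longer needed
```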
I would love to hear back if others in the community also have faced similar challenges and whether this or any other method has helped them improve Group by Performance.
Having in the past used the ODBC method of getting .xlsx sheetnames (How to load EXCEL worksheets and retrieve tab names), I recently worked for a client who didn't have access to be able to install the ODBC driver required on the server.
I was eventually able to get an OLEDB connection working; this shouldn't require an installation on either the client or the server side.
The code below will read data from the excel file specified in the vFilePath variable.
It then reloads the data in the tables2 resident load and excludes the sheet names we do not want to load.
Finally we use a For Each Loop to loop through the sheets and load them into Qlikview.
Hope this helps someone!
Let vFilePath = '\\path\to\my\file\';
//Get the Sheet Name of the file
FOR EACH file in FileList('$(vFilePath)');
//Use the SQLtables command over the OLEDB connection to get the sheet information for the file
OLEDB CONNECT32 TO [Provider=Microsoft.ACE.OLEDB.12.0;Data Source='$(vFilePath)';Extended Properties="Excel 12.0;HDR=YES";];
tables:
SQLtables;
DISCONNECT;
FOR i = 0 to NoOfRows('tables')-1
LET sheetName = purgeChar(purgeChar(peek('TABLE_NAME', i, 'tables'), chr(39)), chr(36));
Concatenate
Table:
Load
FileBaseName() as File,
FileDir() as Dir,
FileName() as File_Name,
'[' & '$(sheetName)' & ']' as Sheet_name
From $(file)(ooxml, embedded labels, table is [$(sheetName)]);
NEXT i;
Next
NoConcatenate
//Drop sheet names we don't need / can't use
tables2:
Load Sheet_name Resident tables
Where not MixMatch(Sheet_name, '[Sheet1]'); //<< Incomplete Sheets
Drop Table tables;
//Loop through the sheetnames in the file and load our data
FOR EACH Sheet in FieldValueList('Sheet_name')
MyFile:
LOAD field1,
field2,
field3,
'$(Sheet)' as Sheet,
Year(Date(Filetime())) as filetime,
right(FileDir(),4) as Year
FROM
[$(vFilePath)]
(ooxml, embedded labels, header is 2 lines, table is $(Sheet));
next;
Drop Table tables2;
Introduction
The purpose of this document is to introduce a set of tools to generate maps of the UK to be used in QlikView through an extension. There are other solutions that address this, but I couldn't find any free ones; Google has also changed access to their API, so that isn't free anymore either.
Through this document I’ll explain how to generate KML maps at various levels of detail to be then used with the quickmap extension created by Brian Munz (https://github.com/brianwmunz/QuickMap-QV11); this extension uses openstreetmap so it is free. Also, I’m providing an example to show how to implement it.
High level overview
There is a wealth of data available in the UK; I started this project when analysing the house price paid data available from the Land Registry. The data is very well structured and every transaction has an associated postcode, so that is the starting point for the visual representation: reference lookup data is available with postcode coordinates as well as the details of which geographical areas it belongs to.
There are multiple geographical levels in the UK, and KML maps are available to download; the challenge with these maps is that they are too large to be loaded in a single instance by the extension. What I've done is break these maps down into smaller areas and establish two drilldown paths:
Different drilldown paths can be set up in the application and the relevant maps generated.
I’ve created 4 QlikView applications:
I’ve included within the applications all the relevant instructions on how to use them.
The whole package is included in the zip file attached; I’ve also included an amended version of the quickmap extension: I’ve added some extra functionalities (ability to define max and min values and possibility to use a color gradient with smooth transitions) so please use this version. Within the qar file I’ve also already embedded the maps required by the example application.
Sample drilldown screenshots
In this example you can see that, as you select the data and drill down, different sets of maps are used for each level that contain progressively more details.
England and Wales district overview:
St Albans District:
MSOA Level - St Albans 002
LSOA level - St Albans 002C
OA level (coordinates) - OA E00120299
I hope someone might find this helpful. I would appreciate any comments or feedback.
Recently I have heard this question several times, so it might be worth writing something about it.
Since version 17.2, NPrinting has opened up its features to external programs through its RESTful APIs. Users can easily control and get data from the system with these APIs. A very popular scenario is starting an NPrinting publish task right after a data reload is done in a QlikView application, so that the published report has the latest data.
There are several possible ways to do it. In this example you will find how to trigger and monitor the NPrinting task through the QlikView load script, which requires nothing more than QlikView Desktop and the Qlik REST Connector, besides the NPrinting environment itself. At the end of this article you will also see how to chain it with a QlikView document reload task in the QMC, so that the NPrinting task starts automatically right after the QlikView reload task is finished.
Some prerequisites for using this example:
And this article might be helpful, with more information regarding the use of the NPrinting APIs.
The attached QlikView application in this example is quite self-explanatory. Here I will just list some key components for your attention.
The screenshots below show an example of creating a POST connection for authentication in Qlik REST connector.
Download the attached QlikView document and reload it from your QlikView server machine to test how it works.
At last, you may want to create a reload task in QlikView QMC and chain it to the reload task of your data application, so that the NPrinting publish task will be triggered automatically after the QlikView data reload task is completed successfully. You can find the settings from the screenshot below, or get help from this community article.
Thanks for reading!
Hi all!
As a consultant I've built several dashboards using QlikView, and in most of them I didn't need to incorporate maps. But if you have ever tried to create a map using only the tools included with the developer license (no extensions allowed), you may have noticed that the layout is not very fancy. That's why I'm here to share how to create maps with more visual impact in 4 simple steps.
Tools/skills that we will need:
STEPS
Determine the regions that you are going to need/implement
Find a map (preferably in PNG or vector format) for the regions that you need. Examples:
Use your skills in Photoshop (don't be afraid if it's your first time)
Take time and patience to configure each section of the map
Don't forget to select "Image" in the presentation field!
TIPS
FINAL VIEW
So the final layout for this project looks like this. I have more tab rows (depending on the selected region).
**********************************************************************
I hope you find this article useful. To be honest, using this method has some issues:
So check the pros and cons and decide if it is worth it!
I've tried GeoAnalytics for Qlik Sense and it is a beast with incredible capabilities and functionality, 100% recommendable. But for those who are still using QlikView, I think this approach works as well as the other alternatives.
Any comments are welcome; the main idea is to improve and share our skills together.
Last status - 15/07/2018
QVW apps and original files with player predictions attached (RAR file)
---------------------------------------------------------------------
Hi guys,
There are only 52 days left until World Cup 2018 (https://fifa.com/worldcup) starts. Who will be the World Champion? Will the Cup stay in Europe, stay in America or go to a new continent?
If, in addition to Qlik, football (soccer) is one of your passions, take part in this competition of the 2018 World Cup:
Players will win points following these rules:
1 point per correctly predicted score of each match (e.g. Portugal 1 - 3 Spain)
20 points for guessing the team that finishes 4th in the World Cup
It is a free competition (of course) with no material prize, beyond demonstrating knowledge of football and having a good time. A QlikView application will be created to show the course of the competition (analysis of the participants, ranking, etc.). We can build this application among all participants, so each one can contribute their knowledge of visualizations, scripting, etc.
I encourage you to participate in the competition, have a good time with fellow Community members and continue learning Qlik, so... fill in the Excel and upload it to Dropbox to participate!
Of course, I will be attentive to any questions that arise, as well as requests for improvements and suggestions.
Thank you very much and greetings,
HM