Qlik Sense documentation and resources.
Mapping data based on postal code areas or center points is a common task. However, postal codes are not one global standard, and the variation in format and structure between countries makes the task challenging.
The availability of postal code areas also varies: many countries are included in the Qlik Location database, but for others postal code areas need to be purchased separately.
Here's the high-level workflow; the bulk of the work is data preparation before rendering:
1. Prepare the data
- Make the postal codes unique
- Pad with zeroes
- Some countries are special
- Check coverage
- Build overview layers when possible
This app shows how to map postal code areas for several European countries: France, Great Britain, Germany and Italy. The input data is not clean and contains errors, as in most real use cases.
Make the postal codes unique
When mapping postal codes for several countries, the first step is to make the postal codes unique. The easiest way to do this is to append a comma and the country code after the postal code:
Postal & ',' & CountryCode AS zip
Pad with zeroes
In many cases Excel drops leading zeroes in numerical postal codes; you can see this when looking at the max and min lengths of the codes. Note that not all postal codes are numeric, and the UK is special, so we fix that later. For France, Germany and Italy, pad with leading zeroes.
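A minimal sketch of the padding step (one way to do it; the field names Postal and CountryCode and the source table RawData are assumptions, and all three countries use 5-digit codes):
ZipData:
LOAD
    // prepend zeroes and keep the rightmost 5 characters to restore leading zeroes dropped by Excel
    Right('00000' & Postal, 5) & ',' & CountryCode AS zip
RESIDENT RawData;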
UK postcodes are alphanumeric and variable in length, ranging from six to eight characters (including a space). Each postcode is divided into two parts separated by a single space. More info is available on Wikipedia. Here are the three levels, comparable to 2-, 3- and 4-digit numeric postal codes:
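A sketch of how the three levels could be derived in the script (the field name Postcode and the table UKData are assumptions; example postcode SW1A 1AA):
UKPostcodeLevels:
LOAD
    Postcode & ',GB' AS zip,
    // Level 1 - postcode area: the leading letters of the outward code (e.g. SW)
    PurgeChar(SubField(Postcode, ' ', 1), '0123456789') AS PostcodeArea,
    // Level 2 - postcode district: the outward code (e.g. SW1A)
    SubField(Postcode, ' ', 1) AS PostcodeDistrict,
    // Level 3 - postcode sector: outward code plus the first digit of the inward code (e.g. SW1A 1)
    SubField(Postcode, ' ', 1) & ' ' & Left(SubField(Postcode, ' ', 2), 1) AS PostcodeSector
RESIDENT UKData;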
There's a common request heard from any Qlik Sense Enterprise Administrator:
"Is there any way to be informed by e-mail when a reload task fails?" Yes, there is!
Before that, let's have some technical small talk (you can jump to the section "Creating your config xml" if you want).
Qlik Sense Enterprise relies on log4net, a very mature, reliable and powerful Apache project. Among its many features, it has the concept of an Appender: the application being logged (Qlik Sense in our case) does not need to take care of where the log is stored; the log4net config files handle this.
You can change a config file and choose whether your logs are stored in a database, a flat file or a Hadoop data lake, and you can even send an e-mail for each error message. See the Qlik Sense Enterprise help on Appenders.
1 - Creating your config xml
Pick your favorite text editor and write your config file according to these examples: https://logging.apache.org/log4net/release/config-examples.html
No, wait!
I'm posting a functional config file here; you can copy and paste it.
I'm assuming you're going to use Gmail as the SMTP server, but it's up to you to choose any working SMTP server you want.
<?xml version="1.0"?>
<configuration>
<!-- Mail appender-->
<appender name="MailAppender" type="log4net.Appender.SmtpAppender">
<filter type="log4net.Filter.LevelRangeFilter">
<param name="levelMin" value="WARN" />
<param name="levelMax" value="ERROR" />
</filter>
<filter type="log4net.Filter.DenyAllFilter" />
<evaluator type="log4net.Core.LevelEvaluator">
<param name="threshold" value="ERROR"/>
</evaluator>
<param name="to" value="admin@domain.com" /> <!-- please use your admin e-mail address here -->
<param name="cc" value="someone.else@domain.com" />
<param name="from" value="yourgmailaccount@gmail.com" /> <!-- you´re supposed to change this -->
<param name="subject" value="Qlik Sense Task Fail!!!" />
<param name="smtpHost" value="smtp.gmail.com" />
<param name="port" value="587" />
<param name="EnableSsl" value="true" />
<param name="Authentication" value="Basic" />
<param name="username" value="yourgmailaccount@gmail.com" /> <!-- you´re supposed to change this -->
<param name="password" value="yourpasswordforGmail" /> <!-- you´re supposed to change this -->
<param name="bufferSize" value="0" />
<!-- Set this to 0 to make sure an email is sent on every error -->
<param name="lossy" value="true" />
<layout type="log4net.Layout.PatternLayout">
<!-- You can change the pattern below if you want to prettify your e-mail message -->
<param name="conversionPattern" value="%newline%date %-5level %newline%property{TaskName}%newline%property{AppName}%newline%message%newline%newline%newline" />
</layout>
</appender>
<!--Send mail on task failure-->
<logger name="System.Scheduler.Scheduler.Slave.Tasks.ReloadTask"> <!-- Filters which exception name is being captured -->
<appender-ref ref="MailAppender" />
</logger>
</configuration>
2 - Saving your config xml
You must save your file with the name LocalLogConfig.xml under C:\ProgramData\Qlik\Sense\Scheduler (at least up to the April 2018 version). Anything different from this will cause your configuration not to work.
3 - Restarting your Scheduler Service
Check if any task is running before restarting the service!!
4 - Test
Get a cup of coffee, wait for a task to fail, and watch your alert e-mail arrive in your inbox.
This blog post was written with a fundamental contribution from ile, who sent me a functional config file.
This is a short summary of how I connect to my corporate Active Directory and extract employee records and group memberships for these employees. This is the most time-efficient way I have found that works in my environment, but there might be other query options.
NOTE: The below example connects to domain.local. Please change these references to your domain name.
The first step is to define a connection in Qlik Sense. I have found that OLEDB with the ADsDSOObject provider works. For server-side app deployment, make sure the service account is allowed to connect to the AD, or add the name and password of an authorized user in the connection settings.
OLEDB CONNECT TO [Provider=ADsDSOObject;Integrated Security=SSPI;Data Source=ldap://domain.local;]
In the load script connect to the directory with a LIB CONNECT string.
LIB CONNECT TO 'LDAP Domain.local (tko)';
The load statement below extracts all person records from AD. In the SQL query, define the field names exactly as they are named in your AD. Notice that the FROM clause requires the LDAP URL of your domain.
ActiveDirectory:
LOAD *
WHERE Not(userAccountControl bitand 2); // Exclude inactive accounts
SQL SELECT
co,
department,
division,
employeeType,
givenName,
l,
mail,
manager,
mobile,
pager,
sn,
sAMAccountName,
telephoneNumber,
title,
userAccountControl,
displayName,
streetAddress,
postalcode,
st,
physicalDeliveryOfficeName
FROM 'LDAP://domain.local'
WHERE objectCategory = 'person' AND memberOf = '*'; // The * wildcard on memberOf keeps all person records that belong to at least one group
The next step is to extract a list of all the existing groups in AD. Notice that your domain name has to be referenced both in the FROM and the WHERE clauses of the SQL query.
ActiveDirectoryGroups:
LOAD
name AS GroupName,
name ,
distinguishedName;
SQL SELECT
name,
distinguishedName
FROM 'LDAP://domain.local'
WHERE objectCategory = 'CN=Group,CN=Schema,CN=Configuration,DC=domain,DC=local';
Extracting all members of each group enables pairing groups with their exact members. Each group is queried to extract the member list.
LET vNoOfGroups = NoOfRows('ActiveDirectoryGroups'); // used in the progress TRACE below
FOR i = 0 to NoOfRows('ActiveDirectoryGroups')-1
// Pick next group name and distinguished name
LET vGroupName = Peek('name' , $(i), 'ActiveDirectoryGroups');
LET vDistinguishedName = Peek('distinguishedName' , $(i), 'ActiveDirectoryGroups');
TRACE $(i) of $(vNoOfGroups) : $(vGroupName);
// Load person identifier from the current distinguished group
// The result is stored in Temporary table
// Each iteration is added to the same Temporary table
TmpSamAccountNameGroups:
LOAD
sAMAccountName,
'$(vGroupName)' AS GroupName
;
SQL SELECT
sAMAccountName
FROM 'LDAP://domain.local'
WHERE objectCategory = 'person' AND memberOf = '$(vDistinguishedName)';
NEXT
The extracted members are then left joined to the employee table to map employees to their group memberships. The result is one table showing all employees and their AD group membership.
// Left join to exclude sAMAccounts from group view that are not active
Left Join ('ActiveDirectory')
LOAD *
Resident TmpSamAccountNameGroups;
// Drop temporary tables
DROP Tables TmpSamAccountNameGroups, ActiveDirectoryGroups;
Welcome to the QS CSS MasterClass.
Motivation:
In my career in presales, I need to create quite a few "user-appealing" applications. Sometimes users request a specific design or specific functions that I need to implement in Qlik Sense. As we all know, Qlik Sense is built for simplicity and self-service, and sometimes it can be challenging to achieve the desired result. But over the last couple of years, a lot of tips and tricks have emerged around using CSS to create completely new designs and functions and to implement a better information design concept.
This is where the MasterClass starts. I have created an application that gathers a couple of these tricks and explains them more in detail. In addition to that, it is very easy to understand because you can see the result directly within a Qlik Sense App. If you think this sounds interesting, take a couple of minutes and join my short tour through the app. I won't cover all aspects in detail. This tour will give you an overview of the documented and used tricks within the app.
Content:
The following list gives you a brief overview of which topic is covered on each sheet within the app:
- You can add CSS definitions to your app by using themes or by using so-called "helper objects". On this sheet, I explain when you should use which option.
- A "helper object" carries and injects the CSS definition on a specific sheet. This sheet shows how to hide this object and which object types can be used for it.
- This example demonstrates how to add better guidance to your dashboards by segmenting your background.
- Sometimes selections are mandatory to consume a dashboard, or the creator wants to guide the user through the filter pane by using colored filter boxes. I show you how.
- How to hide objects like the selection bar, or elements within context menus, in case these functions shouldn't be used in the app or on a specific sheet.
- Adding background pictures to your dashboard can spice up its overall flavor. This can be used for segmentation or just to add some style.
- In this section, we completely change the look and feel of a straight table and a pivot table.
- Let's create an illusion by just moving our objects closer together. Afterwards it will look like we have just one object.
- This sheet explains the easiest way to implement your own font by using a custom theme. It's just a few lines of code.
- No matching grid? This trick shows you how to create your own grid for a specific sheet by changing the metadata through the Qlik Engine API Explorer.
Installation:
Under "Attachments" you can find the required package. The zip package includes a qvf file (Qlik Sense - CSS MasterClass V 1.0.qvf) and an extension (ShowHTMLfromMeasure). After importing the extension and app we need to change a quick configuration because the app has a different ID on your system now.
Open sheet called "Using Background Pictures". Normally this sheet has a background image. This gets referenced over the internal app-ID. Click on "Edit Sheet" and select the displayed CSS box ("CSS config HERE"). Navigate to submenu "Styles" and change the used app-ID in "Styles (CSS)" to your app-ID (displayed in the URL). Now you should see a background picture on this sheet.
Usage:
Every sheet has the same structure. First, a description explains the trick in general and how it works. On most of the sheets, you also get an explanation of the CSS selectors used. On the right-hand side, you can see the used code in a black code box. You can't copy the code from there; to do so, enter edit mode and click on the "CSS config HERE" button. This object is always the "helper object" that carries the used and explained CSS code. Navigate to the "Styles" submenu and copy the code from the "Styles (CSS)" window. I recommend using an external editor to modify or review the CSS code.
If you want to transfer a trick to your own dashboard, you just need to change the object ID. If you don't know how to find the object ID, this is explained on the sheet "Hide (Helper-) Objects".
First of all, I'm very grateful and amazed by Qlik Sense and the Qlik community. I think I've learned so much more in the last couple of months than in the last couple of years.
Specific thanks to Fabrice Aunez from whom I took the idea for Time Analysis and just adapted it to my requirements.
Many thanks to Henric hic for all his posts and comments. Legend!
OK, to the point: the intention here was to streamline the creation of a Master Calendar and time dimensions based on one or more (if required) important dates in your data model.
This subprocess creates several tables that should facilitate set analysis formulas for Calendar, Financial and even Seasonal analysis, such as:
It also creates a few auto-calendar fields for the given date or dates compared against today's date:
I created this routine mostly for Australian organisations but then adapted it to allow for other configurations, although not a great deal of testing has been done for use in other countries, so it'll be up to you to check.
At the moment, the seasonal analysis is done only for Australia, where the seasons are Summer (December-February), Autumn (March-May), Winter (June-August) and Spring (September-November).
I imagine that depending on the feedback, I might redo the code to allow for other configurations of Seasons.
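To illustrate the idea, here is a minimal sketch of a master calendar with an Australian season field (this is not the attached routine; it assumes a fact table Sales with a date field OrderDate):
TempCalendar:
LOAD
    Date(MinDate + IterNo() - 1) AS OrderDate
WHILE MinDate + IterNo() - 1 <= MaxDate;
LOAD
    Floor(Min(OrderDate)) AS MinDate,
    Floor(Max(OrderDate)) AS MaxDate
RESIDENT Sales;

MasterCalendar:
LOAD
    OrderDate,
    Year(OrderDate)                      AS Year,
    Month(OrderDate)                     AS Month,
    'Q' & Ceil(Num(Month(OrderDate))/3)  AS Quarter,
    // Australian seasons: Dec-Feb Summer, Mar-May Autumn, Jun-Aug Winter, Sep-Nov Spring
    If(Match(Num(Month(OrderDate)), 12, 1, 2), 'Summer',
      If(Match(Num(Month(OrderDate)), 3, 4, 5), 'Autumn',
        If(Match(Num(Month(OrderDate)), 6, 7, 8), 'Winter', 'Spring'))) AS Season
RESIDENT TempCalendar;

DROP TABLE TempCalendar;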
I haven't had the chance to work with QlikView yet, only with Qlik Sense, and the main driver for our developments is to empower users to do ad-hoc analytics. That means we need to make things a bit easier for them, and with this data model we were able to create all measures that require time analysis as Master Items.
I started with Qlik Sense 4 months ago and I only know a few functions, so when you look at the code and think 'hmmmm... there's got to be a better way to do this...', you'll be right!!! I just haven't got there yet...
I hope you find this useful. Many thanks for any constructive feedback!
Kind regards,
Luis
Hi Team,
This document describes how to configure task failure notifications in Qlik Sense.
Connector installation and configuration
This part only needs to be done once on all Qlik Sense nodes per environment.
Prerequisites: Anonymous authentication to the <Company> SMTP relay has been opened.
1. Download the Qlik Web Connectors March 2020 Patch 1 (currently installed) from the Qlik download site.
Go to www.qlik.com to log on with a registered customer or partner account, select Services > Customer Downloads, and then select the Qlik Connectors tab.
2. Before you unzip the file, check that the folder is not blocked by Windows security. To unblock the zip file, right click on the file, select Properties, then click on Unblock in the General tab. Click Apply.
3. Unzip QlikWebConnectorsX.X.zip (where X.X is the version number) and place the unzipped files in a folder in the root directory of one of the nodes' Windows servers, e.g. D:\QlikWebConnectors.
4. Start up the Web Connectors server in manual mode to configure it: open D:\QlikWebConnectors\QlikWebConnectors.exe.
If successful, the following window will open.
5. Proceed to http://localhost:5555/web to configure the SMTP connector. Qlik SMTP connector is available on Connectors / Standard page. Click on the connector name to go to the connector Parameters page.
Fill in the details:
SMTP Server: your company's SMTP relay host (check with your Windows team)
Port: 25
SSL Mode: None
To: [Your Email Address to test]
Subject: Qlik Sense Failure Notification
Message: Qlik Sense Says Hi !!
HTML ON
From Email: Your From Address
Send a test email by clicking on Save Inputs & Run Table. You can see the result in the Data Preview section, and you should receive the test email.
6. Close the Web Connectors Page and the Qlik Web Connectors window to stop the connectors server.
7. Edit "Install Service. bat" file in D:QlikWebConnectors, Batch Files to configure the connectors as a service, so they auto start up with the Windows as follows:
REM *****************************
REM IMPORTANT - PLEASE READ:
REM * You should ensure that you have 'unblocked' the Qlik Web Connectors download zip file BEFORE unzipping it and running this.
REM * You need to change the following paths to match your .NET installation directory and QlikWebConnectorsSVC.exe location.
REM * You should run this file as Administrator (right click, 'Run as administrator').
REM *****************************
"%windir%\Microsoft.NET\Framework\v4.0.30319\installutil" "D:\QlikWebConnectors\QlikWebConnectorsSVC.exe"
pause
8. Save the batch file. Right-click the Install Service.bat file and select Run as administrator.
a. The batch file installs the Qlik Web Connectors as a Windows service.
b. When the batch file finishes, press any key to close the command window.
c. You should see an entry for the Qlik Web Connectors service.
9. From the Windows start menu, type Services to open the Services dialog.
a. Right-click on the Qlik Web Connectors service entry and open its Properties.
b. In the Qlik Web Connectors Properties dialog, click Start under the General tab to start the service.
Ensure that the Service status is Running.
Warning: Close the Qlik Web Connectors application before you complete this step. If it is running when you press Start, you will get a message saying that the service has started and then stopped.
c. Select the Recovery tab and set First failure to Restart the Service and set the time for Restart service after to zero (0) minutes.
Apply the changes and click OK.
10. The Qlik Web Connectors should be available at http://localhost:5555/
11. Install and configure the connectors on all other Qlik Sense nodes. The easiest way to do this:
a. Stop the service and copy the folder D:\QlikWebConnectors to all other Qlik Sense nodes.
b. Register the service on each of the nodes (steps 8-10).
c. Start the service which has been stopped at step (a).
12. Create a new data connection in every Qlik Sense environment (once per environment, not once per node).
i. Data load editor - Create new connection - REST
ii. URL: the SendEmail request URL from step 5
iii. Authentication Schema: Anonymous.
iv. Skip server certificate validation: ON
v. Query parameters:
1. to: your email address for testing
2. subject: test
3. message: test
vi. Name: SendEmail
vii. By default, the connection name created via the Data load editor will be something like "SendEmail (<username>)". Remove the user name from the connection name via the QMC.
With the steps above, the connection setup is done.
How do you set up the email alert for an existing dashboard or a newly created one?
1) Let's say I have a Qlik Sense application called QSAppName.qvf (QSAppName is the dashboard name) and this dashboard has a reload task scheduled daily/weekly/monthly, etc.
2) To create the alert, one more application is required:
a) Let's say I've created an application called QSAppName_Failure_Notification.qvf.
b) Go to its script in the data load editor and add a script along the lines of the sketch after this list.
3) Create a task for "QSAppName_Failure_Notification.qvf" that is triggered on failure of the "QSAppName.qvf" task.
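A minimal sketch of such a notification script (the connection name SendEmail comes from step 12; the recipient, subject, message and the returned field names are assumptions - use "Select data" on the connection to generate the exact SELECT for your connector version):
LIB CONNECT TO 'SendEmail';

LET vMessage = 'Reload task for QSAppName failed at ' & Timestamp(Now());

SendEmailResult:
// "status" and "message" are assumed response fields of the SendEmail table
SQL SELECT
    "status",
    "message"
FROM JSON (wrap on) "root"
WITH CONNECTION (
    QUERY "to" "admin@domain.com"
    ,QUERY "subject" "Qlik Sense Failure Notification: QSAppName"
    ,QUERY "message" "$(vMessage)"
);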
Bingo! You have just achieved a very powerful feature; enjoy your emails. I'd love to hear any questions - just use the comment box.
Regards,
Anil Samineni
During a workshop, the most tedious part is assigning QlikCloud users to the workshop space. With Qlik Application Automation, this process can be automated, from the creation of the space to the assignment of the users to the specific space.
First, I created a Google Sheet that contains the list of user emails; one sheet corresponds to one workshop.
I used the List Sheets Of Spreadsheet block to list all the sheets within this workbook.
Then I used the Condition block to check whether the sheet name is prefixed with *. This provides a mechanism to skip any sheet by renaming it.
If it is not prefixed with *, the Create Space block creates a space for each sheet, using the name of the sheet. You will need to set the type of space and, most importantly, set "on error" to "ignore and continue automation and ignore errors". This makes the block skip when the space has already been created.
Once the space is created, the Search Spaces block is used to retrieve the ID of the space, as the space assignment is done by ID.
Next, the Get Data From Sheet With Headers block is used to retrieve the list of users email from the google sheet
For each user email, the Search Users block is used to retrieve the ID based on the email address.
With both the Space ID and User ID, the Add Member To Space block is used to assign the user to the space. Note the roles to assign to the user and the type of member. Also set "on error" to "ignore and continue automation and ignore errors" in case the user has not signed up yet; the automation will then skip this user and move on to the next one.
The following is the entire flow of the Qlik Application Automation
This can then be scheduled to run every day, and possibly more frequently just before the start of the workshop.
By using a Multi-KPI object, we can inject custom CSS into the sheet to override the default cell-height limitations of pivot and straight tables.
Step1:
Drag a Multi-KPI object onto the sheet and add a dummy measure, for example =1.
Next, in the Properties panel of the Multi-KPI object, navigate to Appearance >> Styles.
Add the CSS there.
Step2:
Paste the custom CSS below.
For Straight Tables add the below CSS snippet in the Styles(CSS) box
.qv-st .qv-st-value .qv-st-value-overflow {
max-height: none;
}
For Pivot Tables add the below CSS snippet in the Styles(CSS) box
.qv-pt .cell .value {
max-height: none;
white-space:pre
}
As shown below, each cell now fits the content of the cell, based on the line breaks in the content.
Output
Cheers
Vineeth P
Trouble maintaining your Master Items in Qlik Sense? Look no further: GINQO's Master Item Manager has been further improved. Master Items can become difficult to maintain as you create more of them, which is risky for your development. To solve this issue, we have enhanced the GINQO Master Item Manager to help you work even more effectively with Master Items.
An improved UI, as well as better error handling and drill-down support, makes this a great option for your development. Additionally, all dimensions and measures are now tagged by the extension, making them even easier to maintain and manage.
Try our Master Item Manager extension (https://github.com/GINQO/Master-Item-Manager). Comments and Feedback are always welcome. Additionally, let us know if you would like to contribute to the open-source community.
What is it?
Why was it built?
Who is it for?
Qlik Geocoding is a subscription service for address-to-coordinate lookups and vice versa. The service is hosted: addresses and the key are sent to remote servers for lookup. The subscription includes a limited number of lookups and is valid for one year. Qlik Geocoding is an add-on service to Qlik GeoAnalytics Base and Qlik GeoAnalytics Enterprise Server. The service is provided through the Qlik GeoAnalytics Connector using the operations "AddressPointLookup" and "PointToAddressLookup". Forward geocoding is address-to-point conversion; the result includes information about match level and location structure. Reverse geocoding is point-to-address conversion, useful for example when converting GPS coordinates to an address.
To get started
Below you will find templates for use together with the Qlik GeoAnalytics connector for Qlik Sense and QlikView, as well as a template for GeoOperations (to be used in QCS and QSEoK).
(Qlik Sense Server only, Qlik Sense Desktop has no Repository)
There are situations where you would like to know something about the system configuration (QMC) from inside a load script, for example "Which custom properties does my app have assigned?" or "Which stream is the current app in?" I've seen the "bad" idea of connecting to the PostgreSQL repository database and reading from the internal table structure. This internal structure may, and is likely to, change, so I would not build on this hack.
The correct way to do it is to use the QRS API (Qlik Repository Service). In theory, the QRS API operates on the https port 443 as well as on port 4242. If you use the latter, you have to present the Qlik client certificate, which could not be done from the load script (remark added 2020: the REST Connector meanwhile allows you to add a Certificate File and Key File, so you can set up a connection directly to port 4242 when allowed from a firewall point of view).
In order to use port 443, you need to authenticate as a privileged user towards the server. Your best bet from the script is to use header authentication (pass the user id via an http header), and this requires the setup of a Virtual Proxy on the server by a QMC admin.
Step 1 - Setup Virtual Proxy
We want the script to be able to query the QRS API, so we have to give it a new "door" to authenticate through. Set up a new Virtual Proxy like this: there will be a static pseudo user directory, which we call "APIUSERS", whereas the user id itself will be provided in an http header.
Don't forget to
Note: We will have to come back to QMC/Users once again after making the first call using the header.
Step 2 - Test the qrs endpoint with header authentication
It is hard to go straight to the REST connector in the script without first testing the request in a tool that shows all response details (return code, possible error messages, etc.). My tool of choice is "Postman" (the app, not the Chrome plugin of the same name).
Let's use the server URL + /header/ (this is the virtual proxy we created) + qrs/about + ?xrfkey= followed by a 16-character combination, which you repeat in the http headers under the key "X-Qlik-xrfkey". Also provide the user id as a second http header.
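For example (the server name is a placeholder, and the header name userid must match the header authentication name you configured in step 1):
GET https://yourserver.company.com/header/qrs/about?xrfkey=0123456789abcdef
X-Qlik-xrfkey: 0123456789abcdef
userid: scriptload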
You should get an answer like the above in JSON format. This means you made it through the virtual proxy. A new user has been added to the QMC, namely the id "scriptload" (coming from the http header) with the directory "APIUSERS" (coming from the virtual proxy setting).
Go back to the QMC and give the new user the role "AuditAdmin".
Now you can also query some details about the app. Try the following request before we finally go to the load script: .../qrs/app/full should provide all information about apps, including possible tags.
In order to get the details about one specific app you should use the "filter=" query string; the syntax is the JSON object key, a comparison operator, and a constant. In my case: id eq <app-guid>
"eq" means "equals"; don't use "=", which is a reserved character already used in "filter=".
If this works fine, let's edit the app script.
Step 3 - Create REST Connector call from Load Script
Go to the app's data load editor and create a new connection of type "REST".
The principle I follow here is:
Then load the data using the "Select Data" button in the Data connections list:
You can expand the JSON response tree and deselect what you don't need. Be aware that each tree branch you expand will become a separate script table, so be conservative with what you load, using a "need to know" judgement.
Insert the script. You will see the block called "RestConnectorMasterTable: SQL SELECT ..." followed by several LOAD statements (in our case 4: one for the root and 3 children which we selected above).
Up to this point, the script is fully static: it only loads exactly this QRS endpoint and always loads all apps.
Now let's bring in some dynamics. Find the last row of the first block, where it reads FROM JSON (...) "root" PK "__KEY_root";
Before the semicolon, insert (on a new line) WITH CONNECTION ().
With this parameter you can soft-code and override any hard-coded entry from the "Edit Connection" dialog. If you omit a parameter in WITH CONNECTION, the original hard-coded entry from the Edit Connection dialog is used. You can also add more parameters using WITH CONNECTION. Use it as follows:
Here is my example of getting the info just for the current app:
Add this row before the "LIB CONNECT TO ..." statement:
LET vAppId = DocumentName();
This will assign the app GUID to the variable.
Modify the last row of the first block as follows:
FROM JSON (wrap on) "root" PK "__KEY_root"
WITH CONNECTION (
URL "https://qmi-qs-aai/header/qrs/app/full"
,QUERY "filter" "id eq $(vAppId)"
);
Note: the variable $(vAppId) is expanded into the filter. You can also pass the xrfkey as an additional QUERY parameter, as shown in the fuller example below.
When you execute the script, you will get the app info from the QRS API loaded as a table data model:
Now you can use your scripting skills, for example to check which tags the given app has and act on that:
FOR v = 1 TO FieldValueCount('name_u0')
    LET vTag = FieldValue('name_u0', v); // use the loop counter v to walk through all tag values
    IF '$(vTag)' LIKE 'byCSW' THEN
        TRACE --- this app has the byCSW tag ---;
    END IF
NEXT v
More Advanced Script Example for the QRS API
In the following example I am loading the list of all tasks for a given app. For each task I also want a 1:n list of Custom Properties (if any) and Tags (if any). The returned JSON structure is complex (8 tables for the QRS endpoint /qrs/reloadtask/full):
Deselect what you don't need. I also unchecked a lot of unneeded columns. You are better off spending a few minutes in the "Select data" dialog of the data connector than in the script below. To show which fields I skipped, I commented them out in the code below. I also introduced "nice" field names like "tag.name" or "task.id" instead of auto-generated names such as "name_u4" or "id_u6".
Finally, I joined some tables to end up with only 3 relevant ones.
Here is the sample script:
LET vAppId = DocumentName();
LET vXrfkey = Left(KeepChar(Repeat(Rand(),10),'0123456789'), 16);
TRACE Onetime Xrfkey = $(vXrfkey);
LIB CONNECT TO 'QRS full app info (qmi-qs-aai_vagrant)';
RestConnectorMasterTable:
SQL SELECT
"id" AS "id_u6",
// "createdDate" AS "createdDate_u0",
// "modifiedDate" AS "modifiedDate_u0",
// "modifiedByUserName" AS "modifiedByUserName_u0",
// "isManuallyTriggered",
"name" AS "name_u2",
"taskType",
"enabled",
// "taskSessionTimeout",
// "maxRetries",
// "privileges" AS "privileges_u5",
// "schemaPath" AS "schemaPath_u0",
"__KEY_root",
(SELECT
// "id" AS "id_u0",
// "createdDate",
// "modifiedDate",
// "modifiedByUserName",
"value",
// "schemaPath",
"__KEY_customProperties",
"__FK_customProperties",
(SELECT
// "id",
"name",
// "valueType",
// "privileges",
"__KEY_definition",
"__FK_definition"
FROM "definition" PK "__KEY_definition" FK "__FK_definition")
FROM "customProperties" PK "__KEY_customProperties" FK "__FK_customProperties"),
(SELECT
"id" AS "id_u1",
"name" AS "name_u0",
// "appId",
// "publishTime",
// "published",
// "stream",
// "savedInProductVersion",
// "migrationHash",
// "availabilityStatus",
// "privileges" AS "privileges_u0",
"__FK_app"
FROM "app" FK "__FK_app"),
(SELECT
// "id" AS "id_u5",
"name" AS "name_u1",
// "privileges" AS "privileges_u4",
"__FK_tags"
FROM "tags" FK "__FK_tags")
FROM JSON (wrap on) "root" PK "__KEY_root"
WITH CONNECTION (
URL "https://qmi-qs-aai/header/qrs/reloadtask/full"
,QUERY "filter" "app.id eq $(vAppId)"
,QUERY "xrfkey" "$(vXrfkey)"
,HTTPHEADER "X-Qlik-Xrfkey" "$(vXrfkey)"
// ,HTTPHEADER "userid" "scriptload"
);
[root]:
LOAD
[id_u6] AS [task.id],
// [createdDate_u0] AS [createdDate_u0],
// [modifiedDate_u0] AS [modifiedDate_u0],
// [modifiedByUserName_u0] AS [modifiedByUserName_u0],
// [isManuallyTriggered] AS [isManuallyTriggered],
[name_u2] AS [task.name],
[taskType] AS [task.taskType],
[enabled] AS [task.enabled],
// [taskSessionTimeout] AS [taskSessionTimeout],
// [maxRetries] AS [maxRetries],
// [privileges_u5] AS [privileges_u5],
// [schemaPath_u0] AS [schemaPath_u0],
[__KEY_root] AS [__KEY_root]
RESIDENT RestConnectorMasterTable
WHERE NOT IsNull([__KEY_root]);
LEFT JOIN
//[app]:
LOAD
[id_u1] AS [app.id],
[name_u0] AS [app.name],
// [appId] AS [appId],
// [publishTime] AS [publishTime],
// [published] AS [published],
// [stream] AS [stream],
// [savedInProductVersion] AS [savedInProductVersion],
// [migrationHash] AS [migrationHash],
// [availabilityStatus] AS [availabilityStatus],
// [privileges_u0] AS [privileges_u0],
[__FK_app] AS [__KEY_root]
RESIDENT RestConnectorMasterTable
WHERE NOT IsNull([__FK_app]);
[customProperties]:
LOAD
// [id_u0] AS [id_u0],
// [createdDate] AS [createdDate],
// [modifiedDate] AS [modifiedDate],
// [modifiedByUserName] AS [modifiedByUserName],
[value] AS [customProp.value],
// [schemaPath] AS [schemaPath],
[__KEY_customProperties] AS [__KEY_customProperties],
[__FK_customProperties] AS [__KEY_root]
RESIDENT RestConnectorMasterTable
WHERE NOT IsNull([__FK_customProperties]);
LEFT JOIN
//[definition]:
LOAD
// [id] AS [id],
[name] AS [customProp.name],
// [valueType] AS [valueType],
// [privileges] AS [privileges],
[__KEY_definition] AS [__KEY_definition],
[__FK_definition] AS [__KEY_customProperties]
RESIDENT RestConnectorMasterTable
WHERE NOT IsNull([__FK_definition]);
[tags]:
LOAD
// [id_u5] AS [id_u5],
[name_u1] AS [tag.name],
// [privileges_u4] AS [privileges_u4],
[__FK_tags] AS [__KEY_root]
RESIDENT RestConnectorMasterTable
WHERE NOT IsNull([__FK_tags]);
DROP TABLE RestConnectorMasterTable;
DROP FIELDS [__KEY_definition], [__KEY_customProperties];
Screenshots: alternative starting map (Districts), and multiple selection of areas (available at all levels) at EER, District, MSOA and OA level.
Have you ever noticed that, when you have the same app open twice and make selections, those selections are reflected in all your open tabs? This is because they are all opened using the same session/identity.
This has been one of the most requested features by our users, as they like to compare things side by side without changing the selections every time, or having to build charts with alternate states.
Luckily, there is a way to bypass this using the Qlik Single Integration API.
The Single Integration API provides parameters that can be used to create a URL that returns a complete HTML page containing, for example, an embedded Qlik Sense visualization. This URL can be embedded in a web page, for example by including it in an iframe.
Here's how:
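The key is the identity parameter of the single integration URL: two URLs with different identity values get their own sessions, so selections made in one do not affect the other. A sketch of two such URLs (server, app id and sheet id are placeholders):
https://yourserver.company.com/single/?appid=<app id>&sheet=<sheet id>&identity=compare-1
https://yourserver.company.com/single/?appid=<app id>&sheet=<sheet id>&identity=compare-2
Open each URL in its own tab or iframe and you can make independent selections in the same app.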
Voilà! You are now able to make different selections for the same app!
If you are reading this and perhaps don't even know what a Dynamic View is, don't worry, you are in good company. It's been perhaps one of Qlik's best-kept secrets.
Not sure why so many have kept it a secret but you know I'm an open book. I want you, or specifically your end users, to have access to all of the goodies and Dynamic Views is certainly one of those.
If you do know what Dynamic Views are, then stay here and watch the video to see that Dynamic Views are still very much supported in Qlik Sense Enterprise SaaS. In fact, if anything, you may see that the code to implement them has gotten even easier. Then feel free to read the blog I wrote for DataOnThe.Rocks.
If you are not familiar with Dynamic Views then read the blog first. That way before you start thinking about implementation you understand WHAT Dynamic Views are and WHY I think so highly of them and what they offer your end users. Once you understand the concept and the consumption then the video is your next step.
Monitoring the current performance and availability of a Qlik Sense site is not an out-of-the-box feature, but it can easily be achieved with the native REST connector and the engine/healthcheck JSON API.
There are other tools and approaches, but since I already have the best BI tool available 😉, I found it reasonable to use it. All that is necessary is:
This Qlik Sense app:
It's recommended to deploy and schedule this app on a DEV or TEST site, pointing to all production nodes. Otherwise, there will be no data loaded if the production engine (or the whole server) crashes.
(just FYI - DEV and TEST sites are included in the professional license subscription)
It's also recommended to read the official help site info about the engine API before you deploy this app.
After importing the app to a suitable site, follow these steps:
1. Decide on a target folder for the resulting QVD files, where parsed snapshots of the JSON data will be stored. Create a folder data connection to it (if it doesn't already exist) and set its lib name in the vQVDpath variable in the Settings script section.
2. Create a REST Data Connection to each node:
URL: https://<server domain>/engine/healthcheck OR https://<node IP address>:4747/engine/healthcheck
Authentication Schema: Windows NTLM
User name & password: type root admin or service user credentials
check "Skip server certificate validation" if there is no valid certificate on target node
Query headers:
User-Agent: Windows
Cache-Control: no-cache
3. Adjust the INLINE table in the Settings section of the script according to your site
node - name of the node that will be shown for app’s user
restConnection - name of the REST connection that points to node’s engine/healthcheck
RAM - amount of node's memory in GB
CPU - number of node's logical processors
4. Schedule a reload task. The recommended app reload frequency is 5 minutes or less (the tested reload duration on a 4-node site is 12 seconds). The script will return "N/A" in the [isSaturated] field when the engine is not responding.
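For reference, the core of the per-node load is simply a REST pull of the healthcheck JSON. A minimal sketch for one node (the connection name is a placeholder, and "version" and "saturated" as root-level response fields are assumptions - generate the exact SELECT with "Select data" on your connection):
LIB CONNECT TO 'REST_engine_healthcheck_node1';

HealthCheckRaw:
SQL SELECT
    "version",
    "saturated"
FROM JSON (wrap on) "root";

HealthSnapshot:
LOAD
    'node1'           AS node,          // should match the node column in the Settings INLINE table
    Timestamp(Now())  AS snapshotTime,
    version           AS engineVersion,
    saturated         AS isSaturated
RESIDENT HealthCheckRaw;

DROP TABLE HealthCheckRaw;

STORE HealthSnapshot INTO [$(vQVDpath)/HealthSnapshot_node1.qvd] (qvd);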
The current roadmap includes: a script for email notifications, the possibility to load only N days back, mapping app names to IDs, and node comparisons.
PLEASE! If you make adjustments or additions to the script or front-end visualizations, please share them back on the Qlik Community.
Happy Qliking 🙂
-RADO-