Hello Qlik Community,
As we approach the Paris 2024 Olympics, I have been posting some facts and stories on LinkedIn about results from past editions of the Summer Olympics.
Everything has been done with Qlik Cloud without any extensions.
Here you will find the link to all the posts.
- Enjoy 🙂
Qlik Cloud Analytics has a very powerful and easy-to-use data loading and transformation framework, including no-code data flows and Qlik scripting for advanced transformations. However, there are times when a customer may prefer to handle these transformations as much as possible outside of Qlik Analytics. Maybe the data is used by multiple Qlik Analytics environments in a multi-tenant or hybrid architecture. Or maybe the data set needs to be consumed by other tools in addition to Qlik Analytics. You may also be looking to provide ready-to-use tables for self-service analytics. In these situations, it makes sense to move the logic earlier in the lifecycle so it is not duplicated and all tools can benefit from the transformations.
The aim of this guide is to provide recommendations on how to model your database for minimal-transformation, or even transformation-free, consumption by Qlik Analytics. We cover our top ten recommendations which, if followed, reduce or even eliminate the need for extensive transformations within your Qlik Analytics applications. While some of these recommendations assume a database as the data source, most are applicable across all data sources.
There are many powerful tools in Qlik's Data Integration portfolio that can be used to achieve this, such as Talend Studio, Qlik Talend Data Integration or Qlik Compose. Qlik Talend Data Integration can even replicate tables to QVDs in Qlik Cloud using Change Data Capture (CDC). For more information on these solutions, see Data Integration and Quality. However, this document focuses on the end goals rather than the tools used to achieve them.
I am going to start with what is a somewhat controversial, but I believe necessary, step for many organisations that wish to minimise downstream transformations. Views in an RDBMS allow us to encapsulate all the other changes we will need to make into a virtual table that we can simply read from our Qlik Analytics app. There are a few assumptions behind this approach, though, that if not true may render it unnecessary or undesirable. These assumptions are:
Often when building a data warehouse, data lake or lakehouse we tend to include all possible columns and data so it is available at a later date to meet changing requirements. However, pulling this data into a Qlik Sense app is not recommended unless there is a use for it. Qlik Analytics uses a very powerful in-memory associative model to manage your data, and bringing unused data into memory will have a performance impact.
It is quite common for tables in databases to include standardised columns, for example CreatedBy, CreatedDate, etc. These columns are useful for organisational governance and audit requirements, but they cause problems in Qlik Analytics because Qlik associates tables based on identically named columns. Assuming you cannot exclude the columns, renaming them (e.g. myTable_CreatedBy, myTable_CreatedDate) avoids this issue.
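A minimal load-script sketch of the rename, assuming a database connection is already established and a hypothetical Orders table (the preceding LOAD renames the audit columns before they reach the data model):

Orders:
LOAD
    OrderID,
    CustomerID,
    Amount,
    CreatedBy   as Orders_CreatedBy,   // prefixed so the audit columns no
    CreatedDate as Orders_CreatedDate; // longer auto-associate across tables
SQL SELECT OrderID, CustomerID, Amount, CreatedBy, CreatedDate
FROM dbo.Orders;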
Qlik does not support composite (multi-column) keys in the same way databases do. When Qlik encounters a multi-column key it creates a synthetic key based on all the possible combinations of the key values. Synthetic keys use extra memory and can impact performance, as well as increasing the complexity of the analytics application. They are often not required and are caused by factors such as standardised names (see #3). If there is no natural key, you may be better off using an AutoNumber or GUID function to create a unique key. See Synthetic keys for more information.
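A minimal sketch of replacing a two-column key with a single surrogate key in the load script (table and field names are illustrative):

OrderLines:
LOAD
    AutoNumber(OrderID & '|' & LineNo) as OrderLineKey, // one surrogate key instead of a composite key
    OrderID,
    LineNo,
    ProductID,
    Quantity;
SQL SELECT OrderID, LineNo, ProductID, Quantity
FROM dbo.OrderLines;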
Closely related to #4, Qlik expects primary and foreign key columns to have the same name. So if your primary key is person.id and your foreign key is role.person-id, they will not associate by default. The solution is to rename (or alias) person.id to person.person-id.
There are no benefits from a highly normalised data model in Qlik. Qlik stores values in symbol tables, so denormalised tables require little extra storage, while the extra joins of a normalised model impact performance. Denormalising the data before it reaches Qlik therefore removes the complexity of having to do this afterwards.
Closely related to #6, Qlik Analytics performs best against a star schema, and well against a snowflake schema. This is also true of many other query tools, so transforming into one of these schema styles in your database will provide value across your data landscape. These transformations from a normalised schema can sometimes be easier to make in SQL outside of Qlik Analytics than to hand-code in Qlik script. See star and snowflake schemas & When do we use Snow Flake schema in Data Model?. And as you ideally only want a single fact table, you may want to consider concatenated fact tables (see Concatenate vs Link Table), as sketched below.
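A minimal load-script sketch of concatenating two fact tables into one, assuming hypothetical Sales and Budget tables that are already loaded:

Facts:
LOAD
    'Sales'    as FactType,
    OrderDate  as Date,
    CustomerID,
    Amount
RESIDENT Sales;

CONCATENATE (Facts)
LOAD
    'Budget'   as FactType,
    BudgetDate as Date,
    CustomerID,
    Amount
RESIDENT Budget;

DROP TABLES Sales, Budget;

The FactType flag keeps the two measures distinguishable in set analysis even though they now share a single fact table.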
When loading data from a source database, it is common to filter the data to provide just what is needed for your use case. It is possible to filter any data source; however, if this is done within a Qlik Analytics LOAD statement, the data is first read by Qlik before rows that do not meet the filter are excluded. This increases load times, network bandwidth and resource usage. If, however, the filter is placed on a SELECT statement, it is pushed down to the source and data excluded by the filter is never sent to Qlik. In the following example the end result is the same, but in the former the filtering is done once the data has been read by Qlik, while in the latter it is pushed down to the database and the excluded rows are never seen by Qlik.
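A minimal sketch of both variants, assuming a hypothetical Sales table (use one or the other):

// Filtered by Qlik: every row is transferred, then discarded if it fails the test
Sales:
LOAD *
WHERE Year(OrderDate) >= 2022;
SQL SELECT * FROM dbo.Sales;

// Filter pushed down to the database: excluded rows never leave the source
Sales:
LOAD *;
SQL SELECT * FROM dbo.Sales
WHERE YEAR(OrderDate) >= 2022;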
If this filter is static, it would make sense to include it as part of the view to ensure self-service users do not load unnecessary data.
Analytics rarely involves drilling down to the individual transactions. Therefore, less precision is often needed for analytics than for an operational system. Rounding (say from 8 to 4, or even two decimal places) saves space, reduces data transferred and improves performance. If you are doing this, it makes sense to do it before the data gets to Qlik, and it could be embedded in a view. For example, if I have a 'rate' field that allows 6 decimal places but I only need 2, I could use a statement like "cast(rate as NUMERIC(6, 2))". Doing this before it gets to Qlik provides better consistency across your Qlik landscape and better performance for your self-service users.
Remember when we said to exclude columns not relevant for analytics (#2)? Well, there is an exception, and it relates to the need to understand when data was changed so you can perform incremental loads. Storing the last modified date allows us to load only the data changed since our last reload, which makes a big difference to performance for incremental loads, as sketched below.
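A minimal incremental-load sketch, assuming illustrative SaleID and LastModifiedDate columns and an existing QVD:

// Load only rows changed since the last reload, then merge with the stored QVD
LET vLastReload = '2024-01-01 00:00:00'; // in practice, read this from a log table or from the QVD itself

Sales:
LOAD *;
SQL SELECT * FROM dbo.Sales
WHERE LastModifiedDate >= '$(vLastReload)';

CONCATENATE (Sales)
LOAD * FROM [lib://Data/Sales.qvd] (qvd)
WHERE NOT Exists(SaleID);

STORE Sales INTO [lib://Data/Sales.qvd] (qvd);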
The suggestions on this list are not the only things you can do to make your source database Qlik Analytics ready, but they deserve consideration as they can provide a lot of value. And this is not the only approach you can take - there may be good reasons for taking a different approach in some of these areas; however, at least considering these factors will help you to build an analytics-ready database.
What other suggestions do you have? What has your organisation done to create a Qlik Analytics ready environment? Let me know in the comments!
A question that gets asked regularly is how to calculate a rolling measure over a period of N months (or weeks or days) - for example, a 12-month rolling total or a 4-week rolling average. There are several ways to do this, but these approaches have some limitations that need to be kept in mind. I'll try to explain these approaches and some of their limitations.
First let's load some sample data. The SalesData table below will contain sales amount values for twelve consecutive months.
SalesData:
load * inline [
Month, Amount
1,6
2,4
3,7
4,3
5,4
6,9
7,5
8,7
9,8
10,6
11,9
12,7
];
This is a very simple table with little data, but it is enough for demonstration purposes.
Once this data is loaded it's possible to create a straight table chart object to display the amount per month and a running total. As expected, Month is used as the dimension. The expression sum(Amount) will display the amount per month. Now let's add an expression to calculate a running total over three-month periods.
This can be done in two ways. The first uses the Accumulation option for expressions. The same expression sum(Amount) is used, but now the Accumulation option is set to 3 Steps Back:
The second option uses the rangesum function. That expression looks like this:
rangesum(above(sum(Amount),0,3))
This sums the Amount value on the current row and on the previous two rows. The resulting straight table looks like this:
This looks good. The rolling 3 months amount is calculated correctly. But what happens if a selection of months is made?
The rolling 3 month amount for month 4 is now 3 instead of 14. This is because months 1, 2 and 3 are no longer included in the calculation for the rolling 3 month total.
The accumulation option has another issue. It only works when only one dimension is used in the straight table. The rangesum expression can be modified so it can calculate across dimension borders, but the accumulation option can't. The modified rangesum expression adds the total keyword to the above() function:
rangesum(above(total sum(Amount),0,3))
This goes some way to doing what we want, but the issue of displaying the wrong rolling 3 month amount for month 4 isn't solved yet. Contrary to what I first thought there is a solution for this, as Henric pointed out to me in the comments below. By combining the rangesum with the aggr function it's possible to calculate the correct rolling 3 month amounts for each month. The expression needed for that looks like this:
sum(aggr(rangesum(above(total sum({<Month=>}Amount),0,3)),Month))
Read Elif's blog post Accumulative Sums for a more complete explanation.
How about set analysis expressions?
This expression should calculate the sum of amount for the three month period:
sum({<Month={'>=$(=only(Month)-2)<=$(=only(Month))'}>}Amount)
But notice the only() function. This requires that only one month value is selected. After selecting month 4 the result looks like this:
This shows the selected month, but also the two previous months. And the values are not accumulated.
Ok, but what about the max function instead of only?
sum({<Month={'>=$(=max(Month)-2)<=$(=max(Month))'}>}Amount)
That gives a different result, but still not what we're looking for:
Now only the last three months are shown and again the values are not accumulated.
The 'problem' is that the set is calculated once for the entire chart, not per row. This means that it's not possible here to use Month both as a dimension and in the set modifier in the expression.
There's still an option left to discuss: AsOf tables.
The AsOf table links a period with all the periods in the rolling period. In this example months are used, but it can be applied to any type of period like hours, days or weeks.
For the three month periods needed for a rolling 3 month total this means a month should be linked to itself, the previous month and the month before the previous month. The only exceptions are the first month, which is itself the rolling 3 month period, and the second month that together with the first month is its rolling 3 month period. There are no months before the first month so the first two months cannot run over 3 months.
The AsOf table needed for the rolling 3 month calculations looks like this:
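In outline, each Month_AsOf value links to itself and up to two preceding months:

Month_AsOf  Month
1           1
2           1, 2
3           1, 2, 3
4           2, 3, 4
...
12          10, 11, 12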
This table can be created like this:
AsOfMonth:
load
Month as Month_AsOf,
Month + 1 - IterNo() as Month
Resident SalesData
while IterNo() <= 3;
right join load Month Resident SalesData;
What this does is create three records for every month using the while statement. But that also creates three records for month 1 and 2. This would create a month 0 and a month -1. The right join is used to remove those incorrect month values.
Now that the AsOfMonth table is created, the Month_AsOf field can be used instead of the Month field in the straight table. The expression for the straight table is simply sum(Amount).
The straight table now shows the correct rolling 3 month total for month 4.
This can be expanded a little so not only the rolling 3 month can be shown, but also the amount for the month itself. To achieve this the AsOf table is modified by adding a field to label the type of period. And records are added to the table so each Month_AsOf value is linked to the matching Month value:
AsOfMonth:
load 'Current' as Type,
Month as Month_AsOf,
Month as Month
Resident SalesData;
Concatenate (AsOfMonth)
load 'Rolling 3' as Type,
Month as Month_AsOf,
Month + 1 - IterNo() as Month
Resident SalesData
while IterNo() <= 3;
right join load Month Resident SalesData;
There are now two types of periods available: Current and Rolling 3. Additional period types can be added for example for Rolling 6, Rolling 12 month and Year-to-Date periods. You can find examples of these types in the attached AsOf Table Examples.qvw document.
The period type can be used in the chart expressions to calculate the amount for the wanted period:
Current amount: sum({<Type={'Current'}>}Amount)
Rolling 3 month amount: sum({<Type={'Rolling 3'}>}Amount)
Concluding, there are two solutions that do what we want:
1. The rangesum-aggr combination
2. The AsOf table
The first has the advantage that no changes to the data model are needed. It's also possible to dynamically change the period to aggregate over by using a variable instead of a hardcoded number of periods. A disadvantage is that it's a somewhat complicated expression that also comes with a performance cost.
The AsOf approach needs changes in the data model to create the AsOf table and fill it with the necessary records. The advantage is that it likely performs better on large data sets. It's also quite versatile, since you can add several sets of records to meet different scenarios. The expressions you end up with in the charts are also less complicated. That said, it will likely take you some time to fully understand the AsOf table concept and realize all the places where you can put it to good use.
In the end you'll have to decide for yourself which solution is appropriate in your situation. With regards to the performance of one or the other solution, you will simply have to test to discover if the performance is acceptable. But of course such testing is already part of your development process, right?
I'd like to thank John Witherspoon for introducing me to the AsOf tables concept and Henric for pointing out the solution using the rangesum function in combination with the aggr function.
Hi community,
I assume we have all, once in a while, faced this question: "Is it possible to send a table from Qlik Sense to our WebService/database?"
So, without further ado, the answer is: YES!
I'll now share with you how I implemented this:
0) Prerequisites: the web service is already created (obviously).
1) This is a Qlik Sense Table
alerts:
//This is the alert table, which will be sent to the WS
load alert, RowNo() as row, timestamp(date#(date,'DD/MM/YYYY'),'YYYY-MM-DD hh:mm:ss.ff') as DateAlertes;
load * inline [
alert, date
alert1,06/12/2017
alert2,06/12/2017
];
PS: the row field is created so we can iterate over all the alerts to send.
2) Make a Rest Connector with your Web Service (POST method):
3) Import the data within the WS:
Result:
LIB CONNECT TO 'Post Alert';
RestConnectorMasterTable:
SQL SELECT
"alertName",
"dateCreation",
"dateValidation",
"utcodeValidator"
FROM JSON (wrap on) "root";
[root]:
LOAD [alertName] AS [alertName],
[dateCreation] AS [dateCreation],
[dateValidation] AS [dateValidation],
[utcodeValidator] AS [utcodeValidator]
RESIDENT RestConnectorMasterTable;
DROP TABLE RestConnectorMasterTable;
4) The fourth and most important step is to modify the script above to control the request sent to the WS.
For that, it's mandatory to add our request body.
a)
RestConnectorMasterTable:
SQL SELECT
"alertName",
"dateCreation",
"dateValidation",
"utcodeValidator"
FROM JSON (wrap on) "root" WITH CONNECTION(
BODY "$(vRequestBody)"
);
[root]:
LOAD [alertName] AS [alertName],
[dateCreation] AS [dateCreation],
[dateValidation] AS [dateValidation],
[utcodeValidator] AS [utcodeValidator]
RESIDENT RestConnectorMasterTable;
DROP TABLE RestConnectorMasterTable;
b) Let's now define the request body.
This depends on how the WS is constructed; this phase is handled by the web service experts, and as Qlikers we just ask them what format to send.
FIRST CASE: send the alerts one by one.
For this case, the request body for our Web Service should be as follows:
send:
{"alertName": "Alert1", "dateCreation": "2017-12-06 14:59:25.00"}
then
{"alertName": "Alert2", "dateCreation": "2017-12-06 14:59:26.00"}
//This is the part where we use the REST connector to send the alerts table to the WS
//each time we peek an alert and its date and store them in variables
for a=0 to FieldValueCount('row')-1
Let vAlert = Peek('alert',$(a), 'alerts');
Let vDateCreation = Peek('DateAlertes',$(a),'alerts');
//here we write our body request using the created variables and send them (one by one)
//This is the Json Request to post in the Web Service
LET vRequestBody ='{';
Let vRequestBody = vRequestBody&'"alertName":"$(vAlert)",';
Let vRequestBody = vRequestBody&'"dateCreation":"$(vDateCreation)"';
Let vRequestBody = vRequestBody&'}';
let vRequestBody = replace(vRequestBody,'"', chr(34)&chr(34)); //this is mandatory
LIB CONNECT TO 'Post Alert';
RestConnectorMasterTable:
SQL SELECT
"alertName",
"dateCreation",
"dateValidation",
"utcodeValidator"
FROM JSON (wrap on) "root" WITH CONNECTION(
BODY "$(vRequestBody)"
);
[root]:
LOAD [alertName] AS [alertName],
[dateCreation] AS [dateCreation],
[dateValidation] AS [dateValidation],
[utcodeValidator] AS [utcodeValidator]
RESIDENT RestConnectorMasterTable;
DROP TABLE RestConnectorMasterTable;
next a
SECOND CASE: send all the alerts in one shot.
For this case, the request body for our Web Service should be as follows:
send:
[{"alertName": "Alert1", "dateCreation": "2017-12-06 14:59:25.00"},{"alertName": "Alert2", "dateCreation": "2017-12-06 14:59:26.00"}]
Let vRequestBody ='[';
for a=0 to FieldValueCount('row')-1
Let vRequestBody = vRequestBody& '{"alertName" :'&'"'& Peek('alert',$(a), 'alerts')&'"'&',' & '"dateCreation":'&'"' & Peek('DateAlertes',$(a), 'alerts')&'"'&'},';
next a
// this below is to eliminate the last ',' created
Let vRequestBody = left(vRequestBody,len(vRequestBody)-1);
Let vRequestBody=vRequestBody& ']';
let vRequestBody = replace(vRequestBody,'"', chr(34)&chr(34));
LIB CONNECT TO 'Post Alert';
RestConnectorMasterTable:
SQL SELECT
"alertName",
"dateCreation",
"dateValidation",
"utcodeValidator"
FROM JSON (wrap on) "root" WITH CONNECTION(
BODY "$(vRequestBody)"
);
[root]:
LOAD [alertName] AS [alertName],
[dateCreation] AS [dateCreation],
[dateValidation] AS [dateValidation],
[utcodeValidator] AS [utcodeValidator]
RESIDENT RestConnectorMasterTable;
DROP TABLE RestConnectorMasterTable;
That's all, folks. I hope this can be of help to someone out there.
Omar BEN SALEM.
Hello there!
Solving the Qlik Cloud Content Embedding Puzzle with JWT Iframe Magic
Welcome to a tutorial designed for those navigating the waters of embedding Qlik Cloud content while ensuring a seamless, invisible authentication experience. Say goodbye to complexities and embrace a dynamic way to present Qlik Cloud app/sheet using iframes.
The default behavior of embedding an iframe with tenant URL, app ID, and sheet ID usually leads to unwanted redirections, demanding logins before content becomes visible.
Enter JWT authentication, the solution that sends a payload to Qlik Cloud, signs the response, and generates a JWT token for seamless authentication.
Let's dive in!
Step 1: Enabling JWT for Your Qlik Cloud Tenant
Begin by enabling JWT for your Qlik Cloud tenant. This involves raising a support case with Qlik to activate JWT.
Step 2: Crafting the Key Pair for JWT Configuration
Once JWT is activated, generate a public/private key pair to configure JWT authentication for Qlik Cloud. Follow this guide.
Step 3: Configuring JWT IDP
Once JWT is live in your Qlik Cloud tenant, configure the JWT Identity Provider (IDP) using details obtained from the public/private key pair. Follow this guide.
Step 4: Preparing the Iframe Presentation
Gather essentials:
Take action:
Step 5: Interacting with the JWT Endpoint
This thrilling step involves:
This process allows us to authenticate (POST) against https://yourcloudtenant/login/jwt-session. This session creation facilitates content presentation through iframes.
For implementation, I've used Node.js, requiring the modules fs, jsonwebtoken, and uid-safe. Follow the provided links for installation instructions.
Install Node.js: Follow the installation guide.
Install module uid-safe: Check out the installation details here.
Install module jsonwebtoken: Install it via this link.
The fs module ships with Node.js, so no separate installation is needed.
Create the .js file, e.g., generate.js, to handle payload/signing and JWT token output. Replace placeholders with your specifics.
Here's a peek at the code:
const fs = require('fs');
const jwt = require('jsonwebtoken');
const uid = require('uid-safe');
const privateKey = fs.readFileSync('<the path to your privatekey.pem, for example: certificates/privatekey.pem>', 'utf8'); // Adjust the path and encoding
const signingOptions = {
keyid: '<the keyid. You can copy this when you create the JWT IDP on your cloud tenant>',
algorithm: 'RS256',
issuer: '<the issuer. You can copy this when you create the JWT IDP on your cloud tenant>',
expiresIn: '30s',
notBefore: '-30s',
audience: 'qlik.api/login/jwt-session',
};
const payload = {
jti: uid.sync(32),
sub: '<the subject of the user>',
subType: 'user',
name: '<full name of the user>',
email: '<e-mail address of the user>',
email_verified: true,
groups: []
};
const myToken = jwt.sign(payload, privateKey, signingOptions);
// Output the token
process.stdout.write(myToken);
To proceed, some aspects within the JWT payload need customization. Let's address these requirements:
sub: The "sub" field within the payload represents the user ID used to transmit the payload. Obtain this ID by accessing the URL https://yourcloudtenant/api/v1/users/me after logging into your Qlik Cloud tenant. The response will display the relevant user information on-screen. You'll find a segment like "****-ugW-**-nGH8sRxq97ksBpwx." Remove the asterisks and use this value for the "sub" entry.
name and email: These fields are straightforward. Retrieve the user's full name and email from the same endpoint mentioned earlier. Alternatively, you can click the "i" icon in your user list within the Management Console to access this information.
keyid and issuer: These values are furnished during JWT IDP creation. When you set up the JWT Identity Provider, these details will be visible. Finally, the const privateKey should point to the location of your .pem file on the server.
Make sure to make these adjustments to ensure the JWT payload accurately reflects the user and adheres to the requirements of your Qlik Cloud integration.
This .js file interacts with the JWT endpoint and prepares the authentication token.
Testing JWT Token Generation
Now comes the exciting part – verifying the successful generation of the JWT token. To do this, execute the following command in your terminal:
node <filename.js>
Upon running this command, your terminal will display the resulting JWT token, which appears as a lengthy encoded string. This token is crucial for establishing secure authentication within your Qlik Cloud integration.
Here's a glimpse of the token (partially redacted for security reasons). Every character within this token holds encoded information necessary for the secure operation of your integration.
eyJhbGciOi....7sNtjlad6CJA1AV2552GGMclPXBPdfXUyrxGzNvzN0L9Rr8jhB4AyrVVXUsBjgCo6VbF6WogxPbk7P07OPN-P9awEG_7oCVaBK9gJUXT4GSI8moy7VOMFHO6tzLennruCiHIMUhKy9L3ncXIzwe9vEvllZ1bvs
Remember, your Qlik Cloud experience is about to get smoother as this token plays a pivotal role in seamless and secure authentication. Now that you've got the token, it's time to move on to the next steps.
Step 11: Validating the JWT Token
You can verify the validity of the generated JWT token by using Postman, a versatile API testing tool. Follow these steps:
Download Postman if you haven't already.
Create a new POST request in Postman, directing it to https://yourcloudtenant/login/jwt-session.
Set the necessary headers:
Content-Type: application/json
qlik-web-integration-id: <Your web integration ID>
Authorization: Bearer <Your JWT Token>
If all goes well, you'll receive a 200 OK status, indicating the successful authentication of the JWT token.
Step 12: Crafting the JWT Token Endpoint
To provide a convenient endpoint for code reference, you can create a simple .php file that executes the node generate.js command on your server and presents the JWT token in encoded JSON format.
Here's an example of the PHP code:
<?php
header('Content-Type: application/json');
echo json_encode(['body' => `node generate.js`]);
This endpoint URL will be something like https://www.example.com/qlikcloud/jwt_token.php.
Step 13: Building the Iframe Display Page
Finally, let's build the index.php / index.html file that utilizes the JWT Endpoint to acquire the token for the iframe display. You'll need to replace placeholders with your actual details:
<html>
<head>
<script src="https://unpkg.com/enigma.js/enigma.min.js"></script>
</head>
<body>
<div id="main">
<div id="message"></div>
<iframe id='qlik_frame' style='border:none;width:100%;height:900px;'></iframe>
</div>
<script>
// CONFIGURATION
const TENANT = '<yourcloudtenant>';
const JWTENDPOINT = 'https://www.example.com/qlikcloud/jwt_token.php';
const WEBINTEGRATIONID = '<web integration id>';
const APPID = '<appid>';
const SHEETID = '<sheetid>';
const IDENTITY = '<not mandatory, fill in whatever you want>';
// MAIN
(async function main() {
const isLoggedIn = await qlikLogin();
const qcsHeaders = await getQCSHeaders();
const [session, enigmaApp] = await connectEnigma(qcsHeaders, APPID, IDENTITY);
handleDisconnect(session);
const theme = await getTheme(enigmaApp);
renderSingleIframe('qlik_frame', APPID, SHEETID, theme, IDENTITY);
})();
// LOGIN
async function qlikLogin() {
const loggedIn = await checkLoggedIn();
if (loggedIn.status !== 200) {
const tokenRes = await (await getJWTToken(JWTENDPOINT)).json();
const loginRes = await jwtLogin(tokenRes.body);
if (loginRes.status != 200) {
const message = 'Something went wrong while logging in.';
alert(message);
throw new Error(message);
}
const recheckLoggedIn = await checkLoggedIn();
if (recheckLoggedIn.status !== 200) {
const message = 'Third-party cookies are not enabled in your browser settings and/or browser mode.';
alert(message);
throw new Error(message);
}
}
console.log('Logged in!');
return true;
}
async function checkLoggedIn() {
return await fetch(`https://${TENANT}/api/v1/users/me`, {
mode: 'cors',
credentials: 'include',
headers: {
'qlik-web-integration-id': WEBINTEGRATIONID
},
})
}
// Get the JWT and use it to obtain Qlik Cloud session cookie.
async function getJWTToken(jwtEndpoint) {
return await fetch(jwtEndpoint, {
mode: 'cors',
method: 'GET'
})
}
async function jwtLogin(token) {
const authHeader = `Bearer ${token}`;
return await fetch(`https://${TENANT}/login/jwt-session?qlik-web-integration-id=${WEBINTEGRATIONID}`, {
credentials: 'include',
mode: 'cors',
method: 'POST',
headers: {
'Authorization': authHeader,
'qlik-web-integration-id': WEBINTEGRATIONID
},
})
}
async function getQCSHeaders() {
const response = await fetch(`https://${TENANT}/api/v1/csrf-token`, {
mode: 'cors',
credentials: 'include',
headers: {
'qlik-web-integration-id': WEBINTEGRATIONID
},
})
const csrfToken = new Map(response.headers).get('qlik-csrf-token');
return {
'qlik-web-integration-id': WEBINTEGRATIONID,
'qlik-csrf-token': csrfToken,
};
}
// ENIGMA ENGINE CONNECTION
async function connectEnigma(qcsHeaders, appId, identity) {
const [session, app] = await getEnigmaSessionAndApp(qcsHeaders, appId, identity);
return [session, app];
}
async function getEnigmaSessionAndApp(headers, appId, identity) {
const params = Object.keys(headers)
.map((key) => `${key}=${headers[key]}`)
.join('&');
return (async () => {
const schema = await (await fetch('https://unpkg.com/enigma.js@2.7.0/schemas/12.612.0.json')).json();
try {
return await createEnigmaAppSession(schema, appId, identity, params);
}
catch {
// Handle race condition with new users who do not have permissions to access the application. The code makes another attempt after 1.5 seconds.
const waitSecond = await new Promise(resolve => setTimeout(resolve, 1500));
try {
return await createEnigmaAppSession(schema, appId, identity, params);
}
catch (e) {
throw new Error(e);
}
}
})();
}
async function createEnigmaAppSession(schema, appId, identity, params) {
const session = enigma.create({
schema,
url: `wss://${TENANT}/app/${appId}/identity/${identity}?${params}`
});
const enigmaGlobal = await session.open();
const enigmaApp = await enigmaGlobal.openDoc(appId);
return [session, enigmaApp];
}
// BONUS! DYNAMICALLY FETCH THEME
async function getTheme(enigmaApp) {
const createAppProps = await enigmaApp.createSessionObject({
qInfo: {
qId: "AppPropsList",
qType: "AppPropsList"
},
qAppObjectListDef: {
qType: "appprops",
qData: {
theme: "/theme"
}
}
});
const appProps = await enigmaApp.getObject('AppPropsList');
const appPropsLayout = await appProps.getLayout();
const theme = appPropsLayout.qAppObjectList.qItems[0].qData.theme;
return theme;
}
// HANDLE ENGINE SESSION CLOSURE
function handleDisconnect(session) {
session.on('closed', () => {
const message = '<Your text here> Due to inactivity or loss of connection, this session has ended.';
document.getElementById('qlik_frame').style.display = "none";
document.getElementById('message').innerHTML = message;
});
session.on('suspended', () => {
const message = '<Your text here> Due to loss of connection, this session has been suspended.';
document.getElementById('qlik_frame').style.display = "none";
document.getElementById('message').innerHTML = message;
});
window.addEventListener('offline', () => {
session.close();
});
}
// HELPER FUNCTION TO GENERATE IFRAME
function renderSingleIframe(frameId, appId, sheetId, theme, identity) {
const frameUrl = `https://${TENANT}/single/?appid=${appId}&sheet=${sheetId}&theme=${theme}&identity=${identity}&opt=ctxmenu,currsel`;
document.getElementById(frameId).setAttribute('src', frameUrl);
}
</script>
</body>
</html>
Customizing Integration Parameters
Before proceeding, let's customize a few essential parameters that will tie your integration together seamlessly. These values act as the foundation for your Qlik Cloud integration, enabling it to function smoothly. Let's take a moment to personalize them:
TENANT: Replace with your Qlik Cloud tenant URL.
JWTENDPOINT: Set this to the endpoint where your JWT token will be generated, for instance: https://www.example.com/qlikcloud/jwt_token.php.
WEBINTEGRATIONID: Input your unique web integration ID.
APPID: Insert the App ID of the Qlik Cloud app you want to embed.
SHEETID: Specify the Sheet ID of the specific sheet you wish to display.
IDENTITY: Optionally, add a unique identity to enhance tracking and management.
With these values tailored to your integration's requirements, you're ready to proceed to the next step.
Witness Your Integration in Action
Now that everything is in place, it's time to witness the magic unfold. By serving your index.php or index.html file, you'll see your Qlik Cloud integration come to life.
Prepare to be amazed as the iframe showcases your chosen app and sheet, all seamlessly authenticated without any hassle. Sit back and enjoy the smooth presentation of data within the iframe.
Step 14: Conclusion
And there you have it! By following this thorough guide, you've accomplished the intricate process of embedding Qlik Cloud content using iframes and JWT authentication. Your users can now seamlessly access the content without encountering any login hurdles.
Remember, every placeholder like <...> needs to be replaced with your specific details to make the entire process work seamlessly.
Enjoy the enhanced user experience and feel free to reach out if you encounter any challenges or need further assistance!
So here was the situation I was facing: my boss was screaming at me to do something that didn't exist, so I ... alright, truth be told, he was never screaming at me, and he was fine with the initial workaround I came up with. I was the one screaming, trying to connect to the Google Gemini Pro #LLM in the same Server Side Extension manner I had been using for other Large Language Models. I wanted to be able to ask a question and get a response, but also have the response update when I filtered selections in the data.
The problem is that the API for Google Gemini Pro expects a nested block of JSON, and the generic Advanced Analytics Connector only sends a basic JSON block with field/value pairs. But my hero, Daniel Pilla, helped me realize that a Qlik Application Automation will accept the basic JSON field/value pair block and that I could then infuse the question and data into the URL call as needed for the Google Gemini Pro API.
Feel free to watch the video and skip all of the yapping I do after it. Or, if you think it might be helpful and actually want to try it, feel free to read the step-by-step instructions to give it a try on your end.
To test, let's create a brand-new Qlik Application Automation.
Be sure to set the Start block to Triggered and keep a copy of the information it displays about how to call the automation; you will need it when creating the Advanced Analytics connection inside Qlik Sense momentarily.
Then add an Output block that will simply output “Yeah I did something”
Create a new Qlik Sense application and go to the Load Script.
Before adding our Advanced Analytics connector we need to add some sample data to our load script to test with:
Data:
Load * Inline [
Question#, Prompt
Question 1, "Human: What can you tell me about the Qlik Associative Model and why customers love it so much? Assistant:"
];
Choose Create new connection and then choose Advanced Analytics
Simply copy the URL from the Automation you just created, and then create an HTTP Headers entry for X-Execution-Token as shown, using the token provided.
In the Response Table section, simply input any table name you wish for the Name of Returned Table, uncheck the box that says Load all available fields, and set the Table Field (JMESPath) to response.
You will see why based on the data in a second, but for now just enter Question# as the Association Field and be sure the Send Association Field box is not checked:
Save your newly created Advanced Analytics connector.
Now choose the Select Data icon for your Advanced Analytics test connection
After you type in the name of our sample data table "Data", the connector will show you what information will be returned. Notice in my image it's the AnyNameYouWant table because that is literally what I input; your screen will contain the name you used.
Run the load script and you will see that you sent 1 row and you got 1 row back:
Keep in mind that we didn’t do anything special in our Qlik Application Automation yet, we are simply using this as a test to understand HOW the Advanced Analytics Connector works. So let’s go over to the Qlik Application Automation you created and see what we learned.
Click on your Output block and delete the “Yeah I did something” text that was being output previously:
Now from the dropdown for Data to output choose the right arrow icon for Output from start because we want to see what it actually contains:
Not surprisingly it shows us that it received a simple record. Now click the Toggle view mode icon so that you can see what I mentioned at the onset. It is the very simple JSON block with a simple field/value pair that is passed in:
With that knowledge, untoggle the View mode so we see the array instead, and type "You asked for: ", then click the Prompt field and choose Select first item from list. (While you saw that we only passed 1 entry, the automation assumes the body can be an array, so we need to select the first item, which is all we would ever pass it.)
Instead of calling the automation from the load script again, let's build a simple analytics sheet.
Simply add a Text & Image chart to your new sheet.
Now add a measure
And use the following string as your expression. The connectionname you use is the one you created; you can copy it right from the load script if needed to ensure the spelling and spacing are correct:
Your measure expression should be:
endpoints.ScriptEvalStr('{"RequestType":"endpoint", "endpoint":{"connectionname":"Advanced_Analytics"}}', 'Tell me a dad joke' as Prompt, 'Qlik Dork' as UserName)
The measure we added is evaluated and our Text & Image Chart now contains the response from our Automation:
Now let's go back to the Automation and take another look at it. We know we were just forcing the Output value; it's the Start block that we care about. Notice that now the Start block has a plain JSON body with both fields.
Now let’s add a Table object to our screen. Add the Question# and the Prompt as dimensions. Then add a measure with the following expression:
endpoints.ScriptEvalStr('{"RequestType":"endpoint", "endpoint":{"connectionname":"Advanced_Analytics"}}', Prompt)
Yeah ... we can call our Automation as an SSE-style Advanced Analytics connector, and we know that we can pass fields from our actual data, or made-up values, as needed.
We know there is nothing super exciting about getting a return value that repeats your question. But this post is just the foundation for you to build upon, which is exactly what I did and what I refer to in my video. Who knows what you may end up using the Advanced Analytics Connector for, and when you encounter a red screen error or something similar, you will know just what the issue may be.
On my end, once Dan helped me realize I could call my Application Automation from an Advanced Analytics Connector, the rest of my problem was resolved quickly. All I had to do was add a basic URL call block to my automation and call the Google Gemini Pro API.
Just as the Google Gemini Pro API expects a nested JSON block, it also returns one; I was able to pull the value out of the nested response and return it as the result.
Hi Folks,
If the Instagram account is public, we can fetch the number of likes and comments using the script below in QlikView.
Script:
LOAD
TextBetween([@1:n],'edge_media_preview_like":{"count":',',') as Likes,
TextBetween([@1:n],'"edge_media_to_parent_comment":{"count":',',') as Comments,
SubField(FilePath(), '/', -1) as PhotoID,
FilePath() as Url
FROM
[https://www.instagram.com/p/B4-I3zrAlTn]
(fix, utf8, no labels)
where wildmatch([@1:n], '*"edge_media_preview_like"*');
Output:
Hope it helps someone.
Hello, we have a requirement for users to input data in a Forecast column and write the data back to SQL Server.
In the attached image, the user should be able to enter data in the Forecast column and submit it to a SQL Server database table.
Any ideas on how to implement this? If there are other (third-party) tools or extensions, we are also happy to try them out.
Thanks
Kalyan
Hi,
have you ever needed to delete fields that are actually unused (i.e., empty or zero) from a table with hundreds of columns?
Here is a subroutine that will INSTANTLY remove the columns without useful values:
sub ClearFields(tName) //pass the table name AFTER a Load *...
let cancellati=0;
let totfield=NoOfFields('$(tName)');
let vElenco='';
for i=1 to totfield // field numbers are 1-based
let vFName = PurgeChar(FieldName($(i),'$(tName)'),'[]"');
let vCnt=FieldValueCount('$(vFName)');
if vCnt=1 then // only one distinct value: the field carries no information
vElenco = vElenco & ',[$(vFName)]';
let cancellati=cancellati+1;
end if
next i
if cancellati>0 then
vElenco = mid(vElenco,2); // drop the leading comma
drop fields $(vElenco) from $(tName);
trace Table $(tName): dropped $(cancellati) of the initial $(totfield) fields;
else
trace Table $(tName): no fields dropped out of $(totfield);
end if
end sub
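A minimal usage sketch, assuming a hypothetical wide table loaded from a QVD:

MyTable:
LOAD * FROM [lib://Data/wide_table.qvd] (qvd);

call ClearFields('MyTable'); // drops every field of MyTable that contains only one distinct value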
A while ago, I conducted an internal research project with my colleagues @AlaneMiguelis, and Dr. @priscilarubim, aiming to validate the accuracy of a Two-Sample T-Test in Qlik by comparing it with the medical research gold standard: R. While we explored various hypothesis testing methods, we prioritized formalizing our findings on the T-Test. Recently, we decided to make this document available to the whole community.
The focus of our project was to assess whether the statistical results produced by Qlik matched those generated by R, particularly for the Two-Sample T-Test. Our methodology involved rigorous data analysis and repeated testing to ensure the reliability and consistency of our results. Through our detailed comparative analysis, we aimed to provide insights into the robustness of Qlik’s statistical capabilities. The code and dataset is available in a Git repository for those who want to reproduce the study.
Here is the abstract of our study. The full document can be accessed through the link at the end of this post.
ABSTRACT
Objective
The statistical determination of a large or small difference between two groups is not based on an absolute standard, but is rather an evaluation of the probability of an event [1,2]. In the field of medical research, it is common to use statistical software for descriptive statistics as well as to perform statistical tests [3]. However, most software provides ready-to-use functions, and researchers have almost no information as to how those statistical tests are calculated inside those functions. This article evaluates the accuracy of the two-sample Student's t-test using Qlik analytics software. The gold standard used for this evaluation is the set of standard t-test functions available in R software, a widely used, robust, and reliable statistical software [5-7].
Materials and Methods
The tests performed in this evaluation used a subset of Framingham Heart Study data. The dataset contains data on 4,434 anonymous participants, collected in three periods, 6 years apart, from 1956 to 1968. Five t-tests with 2 scenarios each were performed in Qlik analytics and in R, and the results were compared.
Results
In general, the results for multiple statistics obtained in Qlik analytics match the ones found in R for multiple scenarios: small and large sample sizes, small and large p-values, assuming and not assuming equal variance.
Discussion
Although Qlik analytics matches all statistics for the t-test found in R, the p-value only matches up to four decimal points, which is concluded to be sufficient for hypothesis testing since conventional levels of significance do not go lower than 0.1.
Conclusion
This research concluded that Qlik analytics can be used for two-sample t-tests in multiple scenarios.
Keywords: Qlik, t-test, r language, Framingham.
Link for the full article: Comparing Qlik Analytics and R Software for Accurate Two-Sample T-Tests Research Paper (ipc-global.com)
QlikView > Qlik Sense > Qlik Cloud...
Replatforming is challenging, and sometimes different platforms have to coexist and interact with each other.
In this article I will show how you can call QlikView tasks from Qlik Sense - and potentially from Qlik Cloud - using PowerShell scripts.
# Open Powershell as Admin and Execute the following to allow Powershell Scripts execution
# Set-ExecutionPolicy -ExecutionPolicy RemoteSigned
# Set-ExecutionPolicy Unrestricted
The PowerShell script is as simple as shown below. It requires two parameters:
param (
[Parameter(Mandatory=$true)]
[string]$QlikViewQMCServiceUrl,
[Parameter(Mandatory=$true)]
[string]$TaskID
)
$service = New-WebServiceProxy -Uri $QlikViewQMCServiceUrl -Namespace QlikViewServer -UseDefaultCredential
$serviceKey = $service.GetTimeLimitedServiceKey()
$hdrs = @{}
$hdrs.Add("SOAPACTION","http://ws.qliktech.com/QMS/12/2/IQMS2/RunTask")
$hdrs.Add("Content-Type", "text/xml;charset=utf-8")
$hdrs.Add('X-Service-Key',$serviceKey)
$body = @{}
$body = '<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/">
<s:Body>
<RunTask xmlns="http://ws.qliktech.com/QMS/12/2/">
<taskID>' + $TaskID + '</taskID>
</RunTask>
</s:Body>
</s:Envelope>'
$res = Invoke-WebRequest -Uri $QlikViewQMCServiceUrl -Method Post -Body $body -UseDefaultCredential -Headers $hdrs
We can Run the PowerShell Script in multiple ways:
For now, we'll run it through Qlik Sense Load Script:
EXECUTE powershell.exe -ExecutionPolicy Bypass -File "$(vTempPowerShellScriptFile)" -QlikViewQMCServiceUrl "$(vTempQlikViewQMCServiceUrl)" -TaskID "$(vRunningTask)";
To call the EXECUTE statement in the Load Script you must enable the Legacy Mode in your Qlik Sense Server.
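A minimal load-script sketch showing how the variables used in the EXECUTE statement could be set (the path, URL and task ID below are placeholders):

// Placeholder values - adjust to your environment
LET vTempPowerShellScriptFile  = 'C:\Scripts\RunQlikViewTask.ps1';
LET vTempQlikViewQMCServiceUrl = 'http://qlikviewserver:4799/QMS/Service';
LET vRunningTask               = '00000000-0000-0000-0000-000000000000'; // ID of the QlikView task to trigger

EXECUTE powershell.exe -ExecutionPolicy Bypass -File "$(vTempPowerShellScriptFile)" -QlikViewQMCServiceUrl "$(vTempQlikViewQMCServiceUrl)" -TaskID "$(vRunningTask)";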
That is it! Now from Qlik Sense Server we can call QlikView Tasks with minimum changes in our server environments.
The PowerShell Script in my example contains only the core of its functionality. I strongly recommend including log messages and error handling through try/catch/finally blocks.
I haven't tested this, but here is my theory:
That should do the magic. Please let me know if you have tried this and if it has worked.
I spent a bunch of time on and off trying to figure out how to generate an email alert when a reload fails. It's something I, and I'm sure many others, wish was built in and worked out of the box, but it's actually somewhat possible. There are some restrictions - for me, I sometimes get reloads that hang even though I set a max retry, and since they don't "fail," I never get the notification. And obviously, the email notification is generalized to the point where you have to log into the QMC to discover which app failed. But if you're a Qlik Sense admin, you're smart enough to make customized email alerts for each reload task :).
Setting up Qlik Web Connectors
Setting up Task Failure Triggers
Editing the Notification App
As written above, the actual script that runs the SMTP Connector is complete and already published to Qlik Sense. In other words, if you ever need to edit or re-publish the app or change the SMTP settings, there is little to do. Details below:
[Redacted] QS Script for Send Email app
/**** you shouldn't have to modify these after setup ****/
let vQwcConnectionName = '[web file connector]';
let vTimestamp = timestamp(now());
/**** Edit these variables to change the email contents. ****/
let vMessageInput = 'Qlik Sense Task Failure at ' & '$(vTimestamp)' & '.<br><br>To Review, <a href="https://[Qlik Sense URL].com/qmc">go to the QMC.</a>';
let vSubjectInput = 'Qlik Sense Task Failure';
let vSMTPserver = ' ';
let vFromName = ' ';
let vSMTPPort = '25';
/**** use %40 in place of the @ symbol ****/
let vToEmail = '';
let vFromEmail = '';
/**** DO NOT EDIT ****/
/**** These variable text strings properly encode the text in the "input" variables ****/
/**** Replaces all spaces with plus signs in the message string above ****/
let vMessage = replace('$(vMessageInput)',' ','+');
let vSubject = replace('$(vSubjectInput)',' ','+');
/**** DO NOT EDIT ****/
SMTPConnector_SendEmail:
LOAD
status as SendEmail_status,
result as SendEmail_result,
filesattached as SendEmail_filesattached
FROM [$(vQwcConnectionName)]
(URL IS [http://localhost:5555/data?connectorID=SMTPConnector&table=SendEmail&SMTPServer=$(vSMTPserver)&Port=$(vSMTPPort)&SSLmode=None&to=$(vToEmail)&subject=$(vSubject)&message=$(vMessage)&html=True&fromName=$(vFromName)&fromEmail=$(vFromEmail)&delayInSeconds=0&ignoreProxy=False&appID=], qvx);
// IMPORTANT: If, when loading the above script, you receive a 'Script Error' Dialog box with a 'Field Not Found'
// (or other) error, the first thing you should do is copy and paste the full request URL (i.e. the URL between the square [...]
// brackets IN THE QLIKVIEW / QLIK SENSE SCRIPT ERROR DIALOG) into a browser address bar and check the response.
This article shows an example of how you can configure a Qlik Application Automation workflow to generate a simple report using Qlik Reporting Services and attach it as a PDF file to a message sent in a Microsoft Teams channel. The main challenge addressed is the configuration of the necessary Microsoft Sharepoint and Teams blocks.
Content:
Necessary Automation blocks:
Create a new automation using one of the provided Qlik Reporting templates: Store a simple multi-page PowerPoint report to Microsoft SharePoint
Configure Qlik Reporting blocks to generate the report needed. For more information on how to generate a PDF report using Qlik Application Automation please refer to previously posted articles (see links below).
Here is an example of a simple one-page report:
The next step is to configure Microsoft Sharepoint automation block to copy the generated PDF report to the site of your interest. Certain input parameters were already pre-filled by the template, which was used to generate this automation.
There are several ways to find the relevant information (see links to Microsoft documentation below). This article shows one example approach that you can use.
Let's assume an example scenario, where you want to share the generated PDF report in a teams channel named "My Channel". Under the files tab, you can view the files stored in the Sharepoint attached to your channel. There you can create a new folder, for example named "General", where you will store the generated report files.
To find the driveid and siteid of this Sharepoint site, you can use Microsoft Graph Explorer and retrieve the necessary information by executing a chain of API calls.
1. First, you must log in to your Microsoft account:
2. Knowing the name of your Teams channel ("My Channel"), you can start a search for sites with this keyword:
https://graph.microsoft.com/v1.0/sites?search={Keyword}
Looking at the returned id, it seems that several site ids (separated by commas) were concatenated, as the example Sharepoint site contains one or more folders.
Note: this might look different in your scenario; however, the returned id parameter is the one required for the next steps.
3. This id parameter can now be used as siteid for the next API call, to find the rest of the relevant information for the target folder ("General").
https://graph.microsoft.com/v1.0/sites/{siteid}/drive/items/root/children
Having found the name of the target folder ("General" in the example scenario), you can copy the relevant information into Qlik Application Automation. As this information will be used across multiple automation blocks, it is recommended to store the ids in the respective variables:
4. Configure the "Copy File on Microsoft Sharepoint" block using the new variables. Additionally, adjust the destination path to use the folder "General".
Having stored the newly created file in Sharepoint, you must now retrieve the eTag of this file to be able to attach it to a message in Microsoft Teams.
Use the Microsoft Sharepoint automation block "List Items on Drive" to iterate over items in a certain folder and find the newly generated file dynamically. Use the initially created variables Driveid and Itemid as input parameters:
Include a "Condition" block in the loop to check whether the file name corresponds to the name of the generated report you are looking for:
if {$.listItemsOnDrive.item.name} = {$.generateReport.path}
In case you have found the correct file name, store the eTag and webUrl parameters of this file in variables.
{substr: {$.listItemsOnDrive.item.eTag}, 1, {subtract: {textlength: {$.listItemsOnDrive.item.eTag}}, 4}}
{$.listItemsOnDrive.item.webUrl}
Exit the loop, as you have already found the necessary information.
Insert a Microsoft Teams automation block "Send Message" at the end of your automation. Use the lookup functionality to define your team and channel id, where the message will be posted.
Write the content of your message. Use the following tag in the Content field to insert an attachment into the message. The id of the attachment should be the eTag of the file previously stored in the "FileTag" variable.
<attachment id="{$.FileTag}"></attachment>
Define the attachment code as follows:
[{"id": "{$.FileTag}",
"contentType": "reference",
"contentUrl": "{$.webUrl}",
"name": "{$.generateReport.path}"
}]
This is how it might look in the end:
Save the automation and execute it. You will now receive a message in your Teams Channel with the content and attachment you have defined:
The information in this article is provided as-is and to be used at own discretion. Depending on tool(s) used, customization(s), and/or other factors ongoing support on the solution below may not be provided by Qlik Support.
You can load images stored in a database in BLOB format into Qlik Sense by converting them to Base64-encoded format in the SQL load script.
Furthermore, you can render the image in the Sense UI using the native Qlik Sense Map object. This supported object allows expression-based sourcing and display of the image, both as a data field and as a URL to an HTTP-accessible image.
The attached example walks through the setup using the sample AdventureWorksDW SQL database (which contains image files stored as BLOBs) along with both Qlik Sense and NPrinting.
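As a minimal sketch of the load-script side (assuming SQL Server and the AdventureWorksDW DimProduct.LargePhoto column; the XML value() trick converts the BLOB to Base64 inside the SELECT):

Products:
LOAD
    ProductKey,
    EnglishProductName,
    // a chart expression such as 'data:image/gif;base64,' & PhotoBase64 can then render the picture
    PhotoBase64;
SQL SELECT
    ProductKey,
    EnglishProductName,
    CAST('' AS XML).value('xs:base64Binary(sql:column("LargePhoto"))', 'VARCHAR(MAX)') AS PhotoBase64
FROM dbo.DimProduct
WHERE LargePhoto IS NOT NULL;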
This technique provides a similar alternative to the BUNDLE load technique which was available in QlikView.
The purpose of the attached document is to help you understand the options and be able to configure and test the Anthropic (Amazon Bedrock) connector. For the purposes of the document assume that our goal is to create a connection that we could utilize to ask questions of the data that we pass to Amazon Bedrock. Within this document we will only get your connection tested, but the accompanying Amazon Bedrock Tutorial application will allow you to pass sample data and ask questions of it so that you can test that functionality as well.
Bonus
Reading is one thing, but there is nothing like seeing it in action and getting hands-on. To that end, the attached Amazon Bedrock Tutorial application will let you create a connection, using the guide, and then step through learning exercises to fully understand how you would pass data from within Qlik Sense to Amazon Bedrock and ask questions of it.
The purpose of the attached document is to help you understand the options and be able to configure and test the Azure OpenAI connector. For the purposes of the document assume that our goal is to create a connection that we could utilize to ask questions of the data that we pass to Azure OpenAI. Within this document we will only get your connection tested, but the accompanying Azure OpenAI Tutorial application will allow you to pass sample data and ask questions of it so that you can test that functionality as well.
As the Azure OpenAI Connector will require several parameters that you have to input by hand, the document starts by helping you understand how to know which values to use.
🏆Bonus 🏆
Reading is one thing, but there is nothing like seeing it in action and getting hands-on. To that end, the attached Azure OpenAI Tutorial application will let you create a connection, using the guide, and then step through learning exercises to fully understand how you would pass data from within Qlik Sense to Azure OpenAI and ask questions of it.