The purpose of the attached document is to help you understand the options and be able to configure and test the Anthropic (Amazon Bedrock) connector. For the purposes of the document assume that our goal is to create a connection that we could utilize to ask questions of the data that we pass to Amazon Bedrock. Within this document we will only get your connection tested, but the accompanying Amazon Bedrock Tutorial application will allow you to pass sample data and ask questions of it so that you can test that functionality as well.
Bonus
Reading is one thing, but there is nothing like seeing it in action and getting hands-on. To that end, the attached Amazon Bedrock Tutorial application will let you create a connection using the guide, and then step through learning exercises to fully understand how you would pass data from Qlik Sense to Amazon Bedrock and ask questions of it.
The purpose of the attached document is to help you understand the options and be able to configure and test the Azure OpenAI connector. For the purposes of the document assume that our goal is to create a connection that we could utilize to ask questions of the data that we pass to Azure OpenAI. Within this document we will only get your connection tested, but the accompanying Azure OpenAI Tutorial application will allow you to pass sample data and ask questions of it so that you can test that functionality as well.
As the Azure OpenAI Connector will require several parameters that you have to input by hand, the document starts by helping you understand how to know which values to use.
🏆Bonus 🏆
Reading is one thing, but there is nothing like seeing it in action and getting hands-on. To that end, the attached Azure OpenAI Tutorial application will let you create a connection using the guide, and then step through learning exercises to fully understand how you would pass data from Qlik Sense to Azure OpenAI and ask questions of it.
This article shows you how to configure two Talend Data Catalog bridges to harvest metadata from Snowflake and trace data lineage. It covers the Snowflake Database (via JDBC) and Snowflake Database SQL DML (DI/ETL) Script (SnowSQL) Talend Data Catalog bridges.
Configure the import setup parameters for harvesting metadata from a Snowflake database in Talend Data Catalog:
Define the following parameters on the Import Setup tab: Host, Warehouse, Databases, and the User and Password for authentication.
Select the schemas to be harvested in the Schema field. This example uses the TDC_TEST, TESTSCHEMA1, and TESTSCHEMA2 schemas.
Go to the Overview tab to view an overview of the objects harvested from the Snowflake database.
This example SnowSQL script contains a create view Data Manipulation Language (DML) statement:
create view IF NOT EXISTS testschema2.v_employees ( employee_id, employee_name ) as select rvalue:employee_id::number, rvalue:employee_name::string from testschema1.employees;
You will harvest metadata from this example SnowSQL script in the steps below.
To configure the parameters for the Snowflake Database SQL DML (DI/ETL) Script (SnowSQL) bridge in Talend Data Catalog:
Define the Directory where the SnowSQL file is located on the Import Setup tab.
Add any necessary Include and Exclude filters to select the files to be harvested.
Go to the Overview tab to view the imported SnowSQL script.
View the connection defined for the SnowSQL model in the Configuration Manager.
Notice that after harvesting metadata (database objects and SnowSQL) using both bridges and building the configuration, the Architecture Diagram displays a bi-directional arrow, because the same Snowflake database model is both the source and the destination connection to the database.
Go to the Data Flow tab to view the data flow diagram for the SnowSQL script. Referring to the SnowSQL script, Talend Data Catalog creates a view on one schema (TESTSCHEMA2) with a select statement on another schema (TESTSCHEMA1).
You can also view the data flow diagram for the database object (schema level), which displays the three schemas.
Finally, you can display the data lineage for a column (EMPLOYEE_NAME).
QUESTION:
How do you display or hide one or more sheets of one or more Qlik Sense apps?
ANSWER:
For convenience, I use a simple INLINE table that provides the two fields needed for dynamic matching when a sheet is opened: USER ID and SHEET ID.
SA_SHEET:
LOAD * INLINE [
UTENTE,SHEET
55ad8d29ed671d80ecac0d1b4594546e8f88d730d23a63b4da55f51ed77f4ffb,8f3e7e83-00ef-4c34-9458-54ba3840c123
55ad8d29ed671d80ecac0d1b4594546e8f88d730d23a63b4da55f51ed77f4ffa,0af346e4-b0b3-4a20-b2ee-bf9e869412e2
];
// the green color is the correct record to match (ID USER and ID SHEET)
Add three variables:
the first in the load script:
LET USER = SUBFIELD(OSUSER(),'|',-1); // extract the session user ID (one value only)
the second in the variable editor (named vSHEET_ID, as referenced in the condition):
ObjectId('sheet') // one value for each sheet in the app
the third in the show-condition of the sheet properties:
=index(concat( {< UTENTE = {"$(USER)"} >} distinct SHEET,' '),$(vSHEET_ID))
The result is that when the user ID matches the user of the current session and the sheet ID matches the sheet being opened, the formula returns a value greater than zero, enabling the sheet.
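As a loose illustration (plain Python, not Qlik script) of what the condition computes: concatenate the sheet IDs allowed for the current user, then check whether the current sheet's ID appears in that string. The user hashes and sheet IDs below are made up:

```python
# Sketch of the visibility condition: concat(... distinct SHEET, ' ')
# restricted to UTENTE = current user, then index(result, sheet id) > 0.
# All identifiers below are illustrative, not real values.

ALLOWED = [
    # (user hash, sheet id) pairs, as in the SA_SHEET inline table
    ("userhash-a", "8f3e7e83-00ef-4c34-9458-54ba3840c123"),
    ("userhash-b", "0af346e4-b0b3-4a20-b2ee-bf9e869412e2"),
]

def sheet_is_visible(current_user: str, current_sheet: str) -> bool:
    # concatenate the distinct sheet IDs allowed for this user
    allowed_sheets = " ".join(
        sheet for user, sheet in ALLOWED if user == current_user
    )
    # Qlik's index() returns a position > 0 on a match, 0 otherwise
    return current_sheet != "" and current_sheet in allowed_sheets
```

A sheet whose ID is listed for the current user is shown; any other user opening the app gets a hidden sheet.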
These variables can more conveniently be placed in a TXT file and loaded with an INCLUDE statement, so the code can quickly be added to all published apps (maximum efficiency). The same goes for the INLINE table, which can be included in the load script, hooking it, if present, to the Section Access data used for profiling. Seeing is believing.
Good work
Valerio Fatatis wrote
Hi qlik community members, 👋
Check out this solution to create your own gradient color palette with the ColorMix2() function.
Let's see how we can achieve this result, in a simple data model.
1 – Find the largest and smallest value of your measure.
max(total aggr(Sum([Total amount]),Customer))
min(total aggr(Sum([Total amount]),Customer))
2 – Calculate the difference between these values:
max(total aggr(Sum([Total amount]),Customer))
-
min(total aggr(Sum([Total amount]),Customer))
3 – Now we will create a range, identifying positions 0 and 1 of the index, that is, the values from 0% to 100%.
(sum([Total amount])
-
min(total aggr(Sum([Total amount]),Customer)))
/
(max(total aggr(Sum([Total amount]),Customer))
-
min(total aggr(Sum([Total amount]),Customer)))
4 – To apply the gradient, in your graph go to the Presentation window -> Colors and Legend -> Colors Customize by expression. In the check box, leave the option selected as shown below:
5 – Finally, in the expression insert your calculation within the ColorMix2() function.
ColorMix2((sum([Total amount])
-
min(total aggr(Sum([Total amount]),Customer)))
/
(max(total aggr(Sum([Total amount]),Customer))
-
min(total aggr(Sum([Total amount]),Customer))),red(),blue(),yellow())
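For intuition, here is a rough Python sketch (illustrative only, not Qlik) of the idea behind the expression: normalize each customer's total into the 0..1 range, then blend two RGB colors linearly by that position. The sample totals are invented:

```python
# Sketch of the gradient logic: (sum - min) / (max - min) gives a
# position in 0..1, which drives a linear two-color blend (loosely
# analogous to what ColorMix2() does). Sample data is hypothetical.

def normalize(value, vmin, vmax):
    # 0 at the smallest total, 1 at the largest
    return (value - vmin) / (vmax - vmin)

def mix(t, color_a, color_b):
    # linear blend of two RGB tuples; t=0 -> color_a, t=1 -> color_b
    return tuple(round(a + (b - a) * t) for a, b in zip(color_a, color_b))

totals = {"Acme": 100, "Beta": 250, "Gamma": 400}
vmin, vmax = min(totals.values()), max(totals.values())
RED, BLUE = (255, 0, 0), (0, 0, 255)
palette = {c: mix(normalize(v, vmin, vmax), RED, BLUE) for c, v in totals.items()}
```

The customer with the smallest total gets the first color, the largest gets the second, and everyone else falls on the gradient in between.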
For more details, see this and other color functions in the Qlik Help:
https://help.qlik.com/en-US/sense/November2023/Subsystems/Hub/Content/Sense_Hub/ChartFunctions/ColorFunctions/color-functions-charts.htm
I hope you like it! 🙂
Attached is the qvf for anyone who wants to see the example model exposed.
Yours sincerely, Matheus
Qlik Sense is an awesome tool when it comes to Data Discovery. However, in some situations there is no straightforward solution and one must perform a little magic with associative data model.
One such situation is creating a Market Basket Analysis. The challenge is to enable AND functionality for user selections instead of the standard OR. What do I mean?
Let’s use a simple sales data example. A customer buys either one product, or two or more. When we load all data it’s easy to filter orders by products.
When one product is selected, we get something like this:
When two products are selected, we get:
Selecting 2 or more products will result in logical OR condition, so we get all sales records with first or second product.
Data analysts working with similar (albeit much more complex) sales data are often interested in combinations of products on sales orders. They need a logical AND selection to get only those records where the first as well as the second product was sold.
How to achieve this behavior in Qlik Sense?
Starting is quite easy and without any need for scripting. Using the incredible Aggr() function we can create a calculated dimension, which will count all different products sold on each order.
When we place this dimension into a bar chart, we get a nice overview. Moreover, selecting a particular value will give us all orders where that particular count of distinct products happened.
This simple approach can help to reveal business insights which would remain hidden with standard selections in product dimension.
But, let’s go a bit further. What if we want to start from a specific product and see all different sales combinations that happened?
Field Selection AND-mode
We’ll have to do a bit of scripting, but all operations will be quite basic. There are two main approaches I use.
The first approach is really simple, flexible, and great for data discovery, because all associations are nicely revealed with Qlik's green, white, and grey. One drawback is that the resulting table can grow to massive proportions with really large datasets.
1. Load all orders and products again into a separate table and name the product field differently
2. Join the same table to the previous one, and repeat this as many times as the number of AND selections you want. Beware that this is a cross join of products on an order, so think carefully about the scale of the data and go step by step to see the impact on memory. I join two times in this example.
3. It’s good to clean the resulting table a bit with a few conditional operations in order to dispose of irrelevant or duplicate combinations (example script is in the attached document)
4. (optional) If we use IDs instead of names in previous scripts (which is recommended), then we load the product dimension (along with some categories and types for more analytical options) to a separate table for each AND field.
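The steps above can be sketched in plain Python (the sample orders and products are hypothetical; in the Qlik script this is done with JOIN):

```python
from itertools import product as cross

# Orders and the products sold on them (invented sample data)
order_products = {
    "O1": {"Apples", "Bananas"},
    "O2": {"Apples"},
    "O3": {"Apples", "Bananas", "Cherries"},
}

# Steps 1-2: self-join products per order -> one row per (Product, AND_Product)
rows = [
    (order, p1, p2)
    for order, prods in order_products.items()
    for p1, p2 in cross(prods, prods)
    if p1 != p2          # step 3: drop trivial self-combinations
]

def orders_with(p1, p2):
    # Selecting p1 in the first field AND p2 in the second now behaves
    # like a logical AND over the original order lines
    return sorted({order for order, a, b in rows if a == p1 and b == p2})
```

Note how the row count grows quadratically with the number of products per order, which is exactly the memory caveat from step 2.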
Now we can filter Orders by product combinations and also see the next associative options:
Second approach and its variation using set analysis
This one is much less memory-demanding, but it doesn't provide such a nice associative experience. It is used when working with really large datasets (hundreds of millions of rows) along with a small number of distinct entities (e.g., up to 10 products on one order), or when the sequence of entities is relevant (accounting, production, service desk, etc.).
1. Concatenate all different products on one order to one line
2. Use SubField() function to put each product to a separate field. (Create this script with a FOR...Each loop if the maximum number of distinct products on order is not known beforehand.)
3. (optional) As with the previous approach it's recommended to use IDs and load separate product dimensions for each AND field.
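A plain-Python sketch of this second approach, with invented sample data; the subfield helper mimics what SubField() does in the Qlik script:

```python
# Step 1: concatenate all products of an order into one delimited
# string; step 2: split each position out into its own "AND field".
# Sample data is hypothetical.

orders = {
    "O1": ["Apples", "Bananas"],
    "O2": ["Apples"],
}

# one concatenated line per order
concatenated = {o: "|".join(sorted(p)) for o, p in orders.items()}

def subfield(text, sep, index):
    # mimics SubField(text, sep, index): 1-based, None when out of range
    parts = text.split(sep)
    return parts[index - 1] if 1 <= index <= len(parts) else None

# positional fields Product_1, Product_2, ... as in the FOR...EACH loop
wide = {
    o: {f"Product_{i}": subfield(s, "|", i) for i in range(1, 3)}
    for o, s in concatenated.items()
}
```

Each positional field then acts as an independent selection field, giving AND-mode without the cross-join blow-up.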
If all scripts were executed successfully we get the desired AND-mode for selections in both cases.
All would be well, but there is one more requirement which is quite common: AND-NOT mode. The first approach will not help us here, but nothing is ever lost with Qlik. We can utilize variables and set analysis in expressions along with the second approach. This basically creates a third approach: dynamic search in a list.
1. Create a concatenated field for all distinct products on order as with the second approach
2. Load product dimension again into separate tables. In our example we will have a selection for initial product(s), AND product(s) and AND NOT products.
3. Create variables that will be used to construct the SET analysis in expression.
- for initial products
- for AND products
- for AND-NOT products
4. Finally put all variables into SET analysis of the expression used in the chart
This will enable AND-mode as well as AND-NOT-mode in selections.
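Expressed with Python sets (purely illustrative, with invented sample data), the AND / AND-NOT selection logic of step 4 looks like this:

```python
# Keep orders that contain every "AND" product and none of the
# "AND NOT" products -- the effect of the set-analysis expression.
# Sample data is hypothetical.

order_products = {
    "O1": {"Apples", "Bananas"},
    "O2": {"Apples", "Cherries"},
    "O3": {"Apples", "Bananas", "Cherries"},
}

def select(and_products, and_not_products):
    return sorted(
        o for o, prods in order_products.items()
        if set(and_products) <= prods               # AND-mode: all present
        and not (set(and_not_products) & prods)     # AND-NOT-mode: none present
    )
```

In the Qlik chart, the same intersection and exclusion are built from the three variables inside the set expression.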
Check the attached file to see all 3 approaches in action. I've been successfully using these approaches in my projects, but I’m looking forward to hearing any comments and suggestions.
Radovan
PS: Thanks to Mária Šándorová ( @JaMajka1 ) for suggestions.
PPS: All challenges always have multiple solutions with Qlik. Similar functionality can be also achieved with P() and E() set functions, as described for example in this post - https://community.qlik.com/t5/Qlik-Design-Blog/Visualizing-comorbidity-Set-analysis-element-function-P/bc-p/1468751#M3621
A glance at the labels HTTP and HTTPS suggests the only difference is the addition of an S. But is that really the case? While the "S" might appear to be a single letter, what it represents is significant: it is the reason you need an SSL certificate for your website.
All users and webmasters should understand the basic differences between HTTP and HTTPS. In this article, we look at the inner mechanisms of these protocols, which are central to communication on the web. From deciphering the mechanisms of HTTP and HTTPS to exploring the evolution of protocols like HTTP/2 and HTTP/3, we illuminate the pathways of secure data transmission. Learn why HTTPS is not just an option but an essential upgrade.
Serving as the cornerstone of communication between a user's web browser and the server, the Hypertext Transfer Protocol (HTTP) comes into play each time a user clicks a link or enters a web page URL, initiating an HTTP request to the server. The request indicates the action to be taken, typically fetching the web page. The server then responds with an HTTP status code, which indicates whether the URL was found or whether an error occurred. With this simple request-response model, text, images, and resources of any kind are seamlessly brought onto a web page, and this mechanism is the basis for web browsing as we know it today.
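To make the model concrete, here is a minimal sketch (plain Python, strings only, no network involved) of the request text a browser sends and the status line a server returns:

```python
# The request/response text of the HTTP model: a browser sends a GET
# request for a resource; the server answers with a status line whose
# code says whether the resource was found.

def build_get_request(host: str, path: str) -> str:
    # the browser's side: ask the server for a resource
    return f"GET {path} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"

def parse_status_line(status_line: str) -> int:
    # the server's side: "HTTP/1.1 200 OK" -> 200
    return int(status_line.split(" ")[1])
```

A 200 code means success, 404 means the URL was not found, and 5xx codes signal server errors.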
HTTPS (Hypertext Transfer Protocol Secure) is an extension of standard HTTP that allows for more secure data transfer. The difference is the addition of SSL (Secure Sockets Layer) encryption. When a client connects to an HTTPS website, the server first presents a digital certificate identifying itself. This certificate is used to establish a secure, encrypted tunnel through which information flows, protecting it from being intercepted or manipulated by third parties.
SSL encryption uses a combination of symmetric and asymmetric key encryption to protect both the privacy of a site and the integrity of the data being sent. This security mechanism protects important exchanges, such as online purchases and account access, making HTTPS an indispensable line of defense in the online world. As cyber threats continue to grow, HTTPS remains a bulwark against them, protecting the information users submit and the very foundation of the trust upon which e-commerce is built.
HTTP/2, introduced as an upgrade to HTTP/1.1, overhauled the data exchange process. Breaking with the text format, HTTP/2 uses binary framing, which lowers latency and improves speed. In addition, it lets servers proactively push resources to clients rather than waiting for requests, making a more connected experience possible.
HTTP/3 builds on the improvements introduced by its predecessor, aptly designed as a revolutionary frontier in real-time streaming and high-end transfer. Relying on greater efficiency, HTTP/3 promises to further refine the landscape of online interactions.
In tandem with these protocol upgrades, HTTPS addresses the data-security gaps in HTTP. Modern systems use HTTPS, which is HTTP running over TLS, the successor to SSL. As it evolves, HTTP/3 should slot naturally into the HTTPS family, continuing to strengthen security and increase efficiency in a constantly changing digital world.
Switching from HTTP to secure, encrypted HTTPS provides a host of benefits, centering on security and legitimacy, speed, and efficiency.
An inherent strength of HTTPS is the encryption it applies throughout data transmission, protecting sensitive information such as credit card details or personal data against leaks. This security prevents third parties from intercepting and reading private data that users submit to a website.
Google's search engine also gives greater prominence to HTTPS sites, which underlines their credibility. Web users respond to visual cues: HTTPS-secured sites show a padlock icon in the browser's address bar, signaling that it is safe to shop there. This visible indicator enhances users' confidence and makes HTTPS the default choice for securing information.
Web applications served over HTTPS often load faster than their HTTP counterparts, in part because browsers support newer protocols like HTTP/2 only over encrypted connections, and user satisfaction relies on that performance. What's more, referral information passed to analytics software is far more accurate: without HTTPS, it is difficult to reliably identify referral traffic from sources such as advertisements or social media.
Enabling HTTPS means that referral links can be tracked correctly, so your analytics package has a much better idea of where your best traffic is coming from. Such accuracy is essential for businesses seeking deeper insights from their online efforts: HTTPS isn't just for those concerned about security; it can also improve a site's performance and its analytics capabilities.
Apart from making a website's security more solid, adding HTTPS also helps build trust in search engine optimization (SEO). Google, for example, has long taken security into consideration in its ranking system and ranks HTTPS-encrypted sites higher. This preference stems from the security and integrity that HTTPS ensures, which coincides with search engines' desire to serve searchers secure, trustworthy results. The padlock icon that now appears in the address bar of all major browsers further reassures users and lowers the bounce rate. In the contest to rise in Google's search rankings and to enhance user trust, HTTPS is the secret weapon.
Traditionally, webmasters were saddled with costs associated with registering and maintaining SSL/TLS certificates when setting up HTTPS, but all that has changed. A lot of certificate authorities now provide free SSL certificates. Amazon Web Services (AWS), for instance, offers its AWS Certificate Manager (ACM). ACM simplifies the task by providing, managing, and deploying all the required public and private SSL/TLS certificates, automating the formerly manual tasks of purchasing, uploading, and renewing. Apart from reducing the financial burden, this accessibility to free certificates also reflects a modern trend towards making information security more accessible, which makes adoption of the HTTPS method of protecting data and winning the trust of website visitors an increasingly economical option.
| Feature | HTTP | HTTPS |
| --- | --- | --- |
| Data Transmission | Plain text | Encrypted |
| Security | Vulnerable to eavesdropping and tampering | Secured against eavesdropping and tampering |
| Authentication | No server authentication, susceptible to attacks | Uses SSL/TLS certificates for server authentication |
| Protocol Prefix | http:// | https:// |
| Default Port | 80 | 443 |
| SEO Impact | Lower search engine ranking | Higher search engine ranking |
| User Trust | Less trustworthy, lacks visual cues | Enhanced trust via padlock and other indicators |
| Performance | Often slower in modern browsers (no HTTP/2 support) | Often faster, as browsers support HTTP/2 over TLS |
| Cost | No additional cost for setup | Traditionally required paid SSL/TLS certificates, but free options are now prevalent (e.g., AWS ACM) |
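The Protocol Prefix and Default Port rows of the table can be illustrated with a few lines of Python (a sketch, using only the standard library):

```python
from urllib.parse import urlsplit

# Derive the effective port for a URL from its scheme: http defaults
# to 80 and https to 443 unless the URL names a port explicitly.
DEFAULT_PORTS = {"http": 80, "https": 443}

def effective_port(url: str) -> int:
    parts = urlsplit(url)
    return parts.port or DEFAULT_PORTS[parts.scheme]
```

This is the same rule every browser applies when no port appears after the host name.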
In the online era, the choice between HTTP and HTTPS is not only a technical question; it is a crucial choice about the online environment itself. With its strong encryption, its role in building user confidence, and its search engine optimization (SEO) benefits, HTTPS is the modern standard for secure web transmission. As threats from the cyber world continue to grow, widespread implementation of HTTPS is not just a security necessity; it is part of the battle plan for enterprises and website operators who want to maintain an online environment that is secure, trustworthy, and stable.
This document demonstrates creating Flags in the script to calculate Rolling N Months data.
The same thing can be achieved in various ways using set analysis, the RangeSum function, or accumulation on the front end. But if the Month or MonthYear field is used in a chart, or a selection is made on Month or MonthYear, it becomes difficult to achieve the rolling calculation with those methods, and the expressions can get quite complex.
So it is better to create the flags on the back end, in the master calendar itself.
This method creates flags for rolling months which can then be used on the front end for selection. This gives users the flexibility to select the rolling period of their choice, or the flags can be used in set analysis.
In the application I have created dummy data to link to the calendar.
// Load min and max Date from Fact
MaxDate:
LOAD num(max(FieldValue('Date', recno()))) as MaxDate,
num(min(FieldValue('Date', recno()))) as MinDate
AUTOGENERATE FieldValueCount('Date');
let vMaxDate= Peek('MaxDate',0,'MaxDate');
let vMinDate= Peek('MinDate',0,'MaxDate');
// Generate Dates using min and max date
Cal:
LOAD *,
MonthName(Date) as MonthYear;
LOAD date($(vMinDate)+IterNo()-1) as Date
AutoGenerate(1)
While $(vMinDate)+IterNo()-1<=$(vMaxDate);
MaxMonthYear:
LOAD num(max(FieldValue('MonthYear', recno()))) as MaxMonthYear
AUTOGENERATE FieldValueCount('MonthYear');
// Variable used to restrict MonthYear to <=current month while looping
LET vMaxMonthYear = monthname(Peek('MaxMonthYear',0,'MaxMonthYear'));
// Define Rolling N in Inline table. 1 is the default value for current month
RollMonth:
LOAD * Inline [
RollMonth
1
2
3
6
12 ];
Calender:
LOAD * Inline [
junk ];
for i=1 to FieldValueCount('RollMonth')
LET vRollMonth= FieldValue('RollMonth',$(i));
Concatenate(Calender)
LOAD Date,
MonthYear,
Rolling_Months,
month(Rolling_Months) as Month,
Year(Rolling_Months) as Year,
if(Flag='Rolling1','CurrentMonth',Flag) as Rolling_Flag
where Rolling_Months<=Date#('$(vMaxMonthYear)','MMM YYYY');
LOAD Date,
MonthYear,
monthname(MonthYear,IterNo()-1) as Rolling_Months,
'Rolling'&$(vRollMonth) as Flag
Resident Cal
While IterNo()-1<=$(vRollMonth)-1 ;
NEXT
DROP Tables Cal,MaxMonthYear,RollMonth;
DROP Field junk;
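As a cross-check of what the loop produces, here is a Python sketch of the same rolling-flag logic (the helper names are mine, not part of the Qlik script): each month is flagged into the N months ending at it, capped at the latest month in the data, so that selecting the current month with flag RollingN picks up the previous N months.

```python
from datetime import date

def add_months(d: date, n: int) -> date:
    # first day of the month n months after d (like MonthName(d, n))
    m = d.month - 1 + n
    return date(d.year + m // 12, m % 12 + 1, 1)

def rolling_flags(month: date, windows, max_month: date):
    # For one calendar month, emit (Rolling_Months, Flag) rows as the
    # script's While loop does; window 1 becomes "CurrentMonth".
    rows = []
    for n in windows:
        flag = "CurrentMonth" if n == 1 else f"Rolling{n}"
        for i in range(n):                 # While IterNo()-1 <= N-1
            rm = add_months(month, i)      # monthname(MonthYear, IterNo()-1)
            if rm <= max_month:            # where Rolling_Months <= max month
                rows.append((rm, flag))
    return rows
```

For example, a January row with windows [1, 2] lands in January as CurrentMonth and in both January and February as Rolling2, so selecting Rolling2 on February returns January and February data.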
Please find the attached QVW file.
At the time of writing, Qlik Cloud Services does not yet support GCP user groups during the authentication phase. Thus, allowing GCP users to access Qlik Cloud based on their groups is not (yet) an option.
This is expected to be a temporary situation, and Qlik will deliver this feature sooner or later. Nevertheless, timing matters, and this gap may block SaaS adoption for Qlik enthusiasts who have a hard requirement to rely on Google Groups.
In this context, this article proposes an alternative way to achieve the same business goal, granting users access to spaces based on their group membership, by acting on authorization instead of authentication. This solution is imperfect, widely improvable (anyone is welcome to contribute), and to some extent graceless, but it just works, and sometimes that can make the difference between adopting Qlik SaaS or something else.
Any authenticated user (through GCP or any other solution) who lands on Qlik Cloud triggers the creation of a new user on their first access. This solution hooks into that specific "User Creation" event: a Qlik Application Automation is triggered on it and goes through the following steps.
Here you can find the Google API documentation. The HTTP request retrieves a paginated list of users belonging to the same {groupKey}. This HTTP request needs an authentication method; for this you need to create a service account on the Google platform.
Take care when pasting the private key. It is made of three lines: the first is the "BEGIN PRIVATE KEY" line, the second is the key itself without any carriage return/line feed, and the third is the "END PRIVATE KEY" line. You should end up with something like this:
-----BEGIN PRIVATE KEY-----
<Your private Key in one single line, remove all the CR+LF or /n>
-----END PRIVATE KEY-----
Private Key id from the variable
Private Key from the variable
Time as: {number: {date: 'now', 'U'}}
Select Python as language and paste these lines:
import jwt  # PyJWT

iat = inputs['time']          # current Unix time in seconds ('U' format above)
exp = iat + 3600              # token valid for one hour (JWT exp is in seconds)
payload = {'iss': 'qlik-user-sync@qliksaasidp-377811.iam.gserviceaccount.com',
           'sub': 'qlik-user-sync@qliksaasidp-377811.iam.gserviceaccount.com',
           'aud': 'https://admin.googleapis.com/',
           'iat': iat,
           'exp': exp}
additional_headers = {'kid': inputs['Private Key Id'], 'alg': 'RS256', 'typ': 'JWT'}
signed_jwt = jwt.encode(payload, inputs['Private Key'],
                        headers=additional_headers, algorithm='RS256')
print(signed_jwt)
For more details please check this page.
Remember to use the pageToken as a request parameter.
And in the header, the Authorization Bearer token, as follows:
Here attached to this article you can find the Automation. Remember to replace the placeholder in the Variable Private Key Id and Variable Private Key with your own keys as well as the group in the two call url blocks.
The three-tier architecture includes three layers: extract the data from the various databases and store it in QlikView data (QVD) files; apply the business logic and develop the data model using those QVD files; and finally create the dashboard, using a binary load of the second layer, which lets business users analyse and process the data.
1. Load (Extract Layer and QVD Layer)
2. Transform (QlikMart Layer)
3. Presentation (Presentation Layer)
A long time ago I developed a VBScript application to open QlikView, reduce the data according to a salesman filter, export the sales pivot table to Excel, and send an email with the Excel sheet attached to the salesman using the blat.exe application. The other day I wondered whether the same job could be done with Python, so two minutes later I was googling for code snippets to build the Python script quickly. After half an hour I was quite disappointed not to find anything ready-made, so I opened PyCharm and wrote the basic code: a class to create an instance of QlikView, open the document, export the table, close the document, and close the QlikView app.
Here the code tested on python 3.7:
from win32com.client import Dispatch


class QlikView:
    def __init__(self):
        # Attach to (or start) the QlikView desktop application via COM
        self.app = Dispatch('QlikTech.QlikView')

    def OpenDoc(self, docname):
        return self.app.OpenDoc(docname)

    def CloseDoc(self, doc):
        doc.CloseDoc()


def ManageDocument():
    docname = r"c:\EXAMPLE.qvw"
    q = QlikView()
    print(q.app.QvVersion())
    doc = q.OpenDoc(docname)
    # Export the table object to Excel (BIFF format)
    chart = doc.GetSheetObject("exampletable")
    chart.ExportBiff(r"c:\exampletable.xls")
    q.CloseDoc(doc)
    q.app.Quit()


if __name__ == '__main__':
    ManageDocument()
This is the code I've written as of today; I will certainly go on and rewrite the whole app to send reports attached to emails. If someone needs more information, just send me a message.
Qlik Cloud Data Integration (QCDI) enables data integration teams to build data integration pipelines rapidly using software automation that leverages an Extract, Load, and Transform (ELT) design pattern. A central feature of the ELT design pattern is pushdown SQL, where the transformation code generated automatically by the QCDI solution is “pushed down” to be executed on the cloud data warehouse platform, where the extreme scalability of such platforms can be leveraged fully.
One often-overlooked aspect of generating transformation code using QCDI automation is that it requires the QCDI solution to create and manage the schemas and tables against which it will execute the SQL code it generates.
While efficient and extensible, the ELT-based QCDI pipeline automation solution will create multiple schemas, tables, and views. A table summarizing the schemas, tables, and views created, by QCDI object type, is included at the end of the paper. The body of this paper describes the objects the QCDI pipeline automation creates and their purpose, so that your system administrators can better understand the implications for the physical database layer.
The information in this article is provided as-is and to be used at own discretion. Depending on tool(s) used, customization(s), and/or other factors ongoing support on the solution below may not be provided by Qlik Support.
Before you start reading: The way to import .json with Qlik Sense script is undocumented and may still change at any time without notice.
Qlik script natively reads JSON (JavaScript Object Notation), but it is a bit tricky and slower for large files, and an import wizard is missing that would create the script blocks for you.
When you click on Add data in the connection pane, the file dialog for the folder will show no specific filter for .json files:
Choose "All files" and then select your .json file. Next you will see that the wizard attempts to interpret it as CSV. Pointless, but at least you get the FROM construct properly filled in. Forget the rest, as we will replace it in a minute.
Just hit "Insert script" and then put LOAD * in front of FROM, and instead of (txt ...) put (json):
data:
LOAD *
FROM [lib://:Google_Drive/119UHVSR9_JjkT9o-mz2NAO1xyS1xU56f/01.json]
(json);
Not too complicated so far. If your .json file has only one root object (or one root array of objects), you will already get the keys from the top level, and you are done.
Now let's learn more about the parsing!
A single .json file containing only one object ...
{"key1": 123, "key2": "abc", "key3": true, "arr": [], "obj": {} }
... will result in one line of data.
If the json file has multiple "rows" (objects in an outer array [ ... ]) you automatically will get multiple rows in Qlik.
[ {"key1": 123, "key2": "abc", "key3": true} , {"key1": 234, "key2": "def", "arr": [5,6,7]} ]
The objects can have different keys. Common keys will concatenate into the same field, whereas keys missing in a given 'row' will read "null" (unfortunately as text, not as Null()) in the result.
Now let's look at how to deal with sub-objects and arrays. For this we need to understand the "table is" construct and the hash values.
That is fine for one row, but if you have a json file with many rows, you don't want to call every line with "Root/0/arr", "Root/1/arr", "Root/2/arr" ...
The good thing: You don't have to do it yourself. This article is explaining how to load .json by hand. If you want a quick solution, this one is for you: https://github.com/ChristofSchwarz/QlikScripts/tree/master/json
Assuming a .json file with this content
[ {"name": "Alex" , "age": 50, "kidsAge": [25,23,7,4] } , {"name": "Dhruv" , "age": 27, "kidsAge": [1] } , {"name": "Eyal" , "age": 35, "kidsAge": [] } , {"name": "Chris" , "age": 49, "kidsAge": [17,19] } ]
We will introduce a FOR loop now. Note, if an array is empty [] or is null, the LOAD command inside the Loop would fail on that "row", so we set ErrorMode to 0 (=continue script upon error) and reset it to the default 1 (=fail on error) after the NEXT command
SET vFile = [lib://.../filename.json];

root:
LOAD name, age, kidsAge AS %kidsAge
FROM [$(vFile)] (json, table is 'Root');

SET ErrorMode = 0;
FOR v = 0 TO NoOfRows('root') - 1
  kidsAge:
  LOAD DISTINCT kidsAge AS %kidsAge, NoName AS kidsAge
  FROM [$(vFile)] (json, table is 'Root/$(v)/kidsAge');
NEXT v
SET ErrorMode = 1;
The result is a nicely resolved 1:N relation.
The field name "kidsAge" also exists in the inner LOAD (next to "NoName"), and it returns, yes, the same hash value as in the outer, first LOAD.
The last example is for a sub-object (instead of a sub-array): similarly, you get hashes on the parent level and the content itself when addressing it correctly with the "table is" position. A FOR loop will ease things again, plus some more logic.
In this .json file we have one row where "kids" is null, and the sub-object doesn't always contain the key "happy":
[ {"name": "Alex" , "age": 50, "kids": {"count": 4, "happy": true } } , {"name": "Dhruv" , "age": 27, "kids": {"count": 1, "happy": true } } , {"name": "Eyal" , "age": 35, "kids": null} , {"name": "Chris" , "age": 49, "kids": {"count": 2 } } ]
We can't count on auto-concatenation any more for the sub-objects (row 4 would end up in a different table than rows 1 and 2). Also, without error tolerance, the import of the 3rd row would fail. So we go for this script, where we create an empty table for the "kids" and then explicitly concatenate into it. We have to use the LOAD * functionality because the structure of the sub-objects is not identical throughout the main array. (Assuming you set vFile yourself.)
SET vFile = [lib://..../filename.json];

t_Root:
LOAD *  // outer load
FROM [$(vFile)] (json, table is 'Root');

SET ErrorMode = 0;
t_Kids:
LOAD * INLINE [kids];  // start empty table
FOR v = 0 TO NoOfRows('t_Root') - 1
    CONCATENATE(t_Kids)
    LOAD DISTINCT *  // inner load
    FROM [$(vFile)] (json, table is 'Root/$(v)/kids');
NEXT v
SET ErrorMode = 1;
We will now get two tables that are 1:1 related (or 1:0..1, to be precise).
Note: the inner LOAD * will return the "kids" as a field, too, containing the hash that links to the first table
The result can easily be used now in the Client
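To picture the data model that the script above produces, here is a small JavaScript sketch. It is illustrative only: Qlik's internal hash value is replaced by a simple JSON-string key, but the parent/child linking works the same way.

```javascript
// Illustrative sketch: split the sample JSON into a parent table and a
// linked child table, the way the Qlik script above does. The "key" here
// is a JSON string standing in for Qlik's internal hash value.
const rows = [
  { name: "Alex",  age: 50, kids: { count: 4, happy: true } },
  { name: "Dhruv", age: 27, kids: { count: 1, happy: true } },
  { name: "Eyal",  age: 35, kids: null },
  { name: "Chris", age: 49, kids: { count: 2 } }
];

const keyOf = (obj) => JSON.stringify(obj); // stand-in for the hash value

// Parent table: keep scalar fields, replace the sub-object by its key
const t_Root = rows.map(({ name, age, kids }) => ({ name, age, kids: keyOf(kids) }));

// Child table: one row per non-null sub-object, keyed the same way
// (skipping nulls plays the role of ErrorMode 0 in the Qlik script)
const t_Kids = rows
  .filter(r => r.kids !== null)
  .map(r => ({ kids: keyOf(r.kids), ...r.kids }));

console.log(t_Root);
console.log(t_Kids);
```

Note how "Eyal" appears only in the parent table, and how "Chris" has no "happy" field in the child table, just as in the Qlik result.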
Uff, it can get tricky, but next week I will show you that all the manual steps (loop, field aliasing, concatenating) can be solved by a recursive SUB.
I hope, though, that Qlik will also deliver a wizard that does this parsing on import, like the one we have for JSON objects received via the REST Connector or, likewise, for parsing XML files.
How can I make sure that certain data is available only to certain users?
The answer is SECTION ACCESS. Section access is a connection between an authorization table (commonly placed in a hidden script and/or outsourced to an include variable) and the application data (mostly to a dimension table).
Furthermore, you need to enable section access within the document properties on the Opening tab, and enabling strict exclusion in particular is important: without it, section access is more of a convenience feature that spares users who need only a small data area of the application from confusion, but it is not suitable for truly confidential data.
Introduction to Section Access
Data Reduction – Yes, but How?
Section Access: Strict Exclusion
QlikView Section Access Examples
Data Reduction Using Multiple Fields
Section Access (Data Reduction) using Active Directory
In addition to these, more complex restrictions are possible, but before you build one, think about alternatives like document chaining, whereby the chained applications each have a specialized and simpler section access.
Basics for complex authorization
Authorization using a Hierarchy
Restrictions based on section access can also be applied to sheets and objects.
Sheets Security with Section Access
Sheets Security with Section Access File
Sometimes there is a need to mask or de-identify data for certain users; for this, see: Mask or de-identify data for certain users using Section Access.
Finally, the most important remark: before even thinking about section access, make one or more BACKUPS of your applications, because there is no way back after a mistake! One exception (which doesn't nullify the suggestion of keeping backups) can be found here: Bypassing QlikView Section Access – Infinity Insight Blog.
There is some content overlap between the links used above, but I think the way they complement each other is useful, and of course you will find many more interesting postings on these topics here in the Qlik Community. The notes here are a good starting point to go further.
Have fun!
In many industries including education and healthcare, it is necessary to mask or de-identify certain sensitive fields for certain user groups but also be able to see the data in total for the purposes of comparison or benchmarking.
This can be achieved by setting a flag in the section access table to determine whether users should see the real data or a masked version of the sensitive fields. In the example below, the code restricts some fields for users at the individual school level but allows schools to compare their results to the rest of the schools in the data set.
Please refer to the attached QVF and remember that Section Access does not function in Qlik Sense Desktop and must be deployed to a Qlik Sense Server environment. I've also attached the QVW for the QlikView solution.
Notes
userA from SchoolA will see real Teacher and Student values for School A records ONLY. All other schools will show the words "De-identified" in the Teacher and Student columns:
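The masking rule itself is simple; this toy JavaScript sketch (field and school names are made up) illustrates the row-level logic that the section access flag drives inside the Qlik script:

```javascript
// Illustrative only: mask sensitive fields on rows outside the user's own
// school, while leaving all rows visible for benchmarking.
const rows = [
  { school: 'SchoolA', teacher: 'Smith', student: 'Jones' },
  { school: 'SchoolB', teacher: 'Lee',   student: 'Khan'  }
];

function maskForUser(rows, userSchool) {
  return rows.map(r => r.school === userSchool
    ? r  // own school: real values
    : { ...r, teacher: 'De-identified', student: 'De-identified' });
}

console.log(maskForUser(rows, 'SchoolA'));
```

In the attached QVF/QVW the equivalent decision is made in the load script based on the flag carried in the section access table.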
Credit to my colleague Yaniv Feldman yfe for this solution.
Matt Nunn
Qlik Consulting
Hello there!
Solving the Qlik Cloud Content Embedding Puzzle with JWT Iframe Magic
Welcome to a tutorial designed for those navigating the waters of embedding Qlik Cloud content while ensuring a seamless, invisible authentication experience. Say goodbye to complexities and embrace a dynamic way to present Qlik Cloud app/sheet using iframes.
The default behavior of embedding an iframe with tenant URL, app ID, and sheet ID usually leads to unwanted redirections, demanding logins before content becomes visible.
Enter JWT authentication, the solution that sends a payload to Qlik Cloud, signs the response, and generates a JWT token for seamless authentication.
Let's dive in!
Step 1: Enabling JWT for Your Qlik Cloud Tenant
Begin by enabling JWT for your Qlik Cloud tenant. This involves raising a support case with Qlik to activate JWT.
Step 2: Crafting the Key Pair for JWT Configuration
As JWT gets activated, generate a public/private key pair to configure JWT authentication for Qlik Cloud. Follow this guide.
Step 3: Configuring JWT IDP
Once JWT is live in your Qlik Cloud tenant, configure the JWT Identity Provider (IDP) using details obtained from the public/private key pair. Follow this guide.
Step 4: Preparing the Iframe Presentation
Gather essentials:
Take action:
Step 5: Interacting with the JWT Endpoint
This thrilling step involves:
This process allows us to authenticate (POST) against https://yourcloudtenant/login/jwt-session. This session creation facilitates content presentation through iframes.
For implementation, I've used Node.js, requiring the modules jsonwebtoken and uid-safe. Follow the provided links for installation instructions.
Install Node.js: Follow the installation guide.
Install module uid-safe: Check out the installation details here.
Install module jsonwebtoken: Install it via this link.
Note: the fs module is built into Node.js, so no separate installation is needed.
Create the .js file, e.g., generate.js, to handle payload/signing and JWT token output. Replace placeholders with your specifics.
Here's a peek at the code:
const fs = require('fs');
const jwt = require('jsonwebtoken');
const uid = require('uid-safe');
const privateKey = fs.readFileSync('<the path to your privatekey.pem, for example: certificates/privatekey.pem>', 'utf8'); // Adjust the path and encoding
const signingOptions = {
keyid: '<the keyid. You can copy this when you create the JWT IDP on your cloud tenant>',
algorithm: 'RS256',
issuer: '<the issuer. You can copy this when you create the JWT IDP on your cloud tenant>',
expiresIn: '30s',
notBefore: '-30s',
audience: 'qlik.api/login/jwt-session',
};
const payload = {
jti: uid.sync(32),
sub: '<the subject of the user>',
subType: 'user',
name: '<full name of the user>',
email: '<e-mail address of the user>',
email_verified: true,
groups: []
};
const myToken = jwt.sign(payload, privateKey, signingOptions);
// Output the token
process.stdout.write(myToken);
To proceed, some aspects within the JWT payload need customization. Let's address these requirements:
sub: The "sub" field within the payload represents the ID of the user for whom the token is issued. Obtain this ID by accessing https://yourcloudtenant/api/v1/users/me after logging into your Qlik Cloud tenant. The relevant information is displayed on-screen. You'll find a segment like "****-ugW-**-nGH8sRxq97ksBpwx"; remove the asterisks and use this value for the "sub" entry.
name and email: These fields are straightforward. Retrieve the user's full name and email from the same endpoint mentioned earlier. Alternatively, you can click the "i" icon in your user list within the Management Console to access this information.
keyid and issuer: These values are furnished during JWT IDP creation. When you set up the JWT Identity Provider, these details will be visible. Finally, the const privateKey should point to the location of your .pem file on the server.
Make sure to make these adjustments to ensure the JWT payload accurately reflects the user and adheres to the requirements of your Qlik Cloud integration.
This .js file interacts with the JWT endpoint and prepares the authentication token.
Testing JWT Token Generation
Now comes the exciting part – verifying the successful generation of the JWT token. To do this, execute the following command in your terminal:
node <filename.js>
Upon running this command, your terminal will display the resulting JWT token, which appears as a lengthy encoded string. This token is crucial for establishing secure authentication within your Qlik Cloud integration.
Here's a glimpse of the token (partially redacted for security reasons). It carries, in encoded and signed form, the information necessary for the secure operation of your integration:
eyJhbGciOi....7sNtjlad6CJA1AV2552GGMclPXBPdfXUyrxGzNvzN0L9Rr8jhB4AyrVVXUsBjgCo6VbF6WogxPbk7P07OPN-P9awEG_7oCVaBK9gJUXT4GSI8moy7VOMFHO6tzLennruCiHIMUhKy9L3ncXIzwe9vEvllZ1bvs
Remember, your Qlik Cloud experience is about to get smoother as this token plays a pivotal role in seamless and secure authentication. Now that you've got the token, it's time to move on to the next steps.
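As a side note, before using the token you can sanity-check its claims locally: a JWT is base64url-encoded and signed, not encrypted, so its payload can be decoded without any key. Here is a minimal Node.js sketch; the sample token is constructed inline purely for demonstration, it is not a real Qlik Cloud token.

```javascript
// Decode a JWT payload locally (no signature verification) so you can
// confirm claims like sub, name and email before posting the token.
function decodeJwtPayload(token) {
  const part = token.split('.')[1];                      // header.payload.signature
  const b64 = part.replace(/-/g, '+').replace(/_/g, '/'); // base64url -> base64
  return JSON.parse(Buffer.from(b64, 'base64').toString('utf8'));
}

// Hypothetical sample token, built inline just to demonstrate decoding
const payloadB64 = Buffer.from(JSON.stringify({ sub: 'abc123', name: 'Jane Doe' })).toString('base64');
const sampleToken = `header.${payloadB64}.signature`;

console.log(decodeJwtPayload(sampleToken)); // prints { sub: 'abc123', name: 'Jane Doe' }
```

Because the payload is readable by anyone holding the token, treat the token itself as a credential and never log it in production.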
Step 6: Validating the JWT Token
You can verify the validity of the generated JWT token by using Postman, a versatile API testing tool. Follow these steps:
Download Postman if you haven't already.
Create a new POST request in Postman, directing it to https://yourcloudtenant/login/jwt-session.
Set the necessary headers:
Content-Type: application/json
qlik-web-integration-id: <Your web integration ID>
Authorization: Bearer <Your JWT Token>
If all goes well, you'll receive a 200 OK status, indicating the successful authentication of the JWT token.
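If you prefer scripting the check over Postman, the same request can be assembled in Node.js. Below is a sketch of a hypothetical helper (buildJwtSessionRequest is my own name, not a Qlik API) that builds the URL and headers from the Postman setup above; the actual fetch call is commented out because it needs a live tenant:

```javascript
// Build the POST request used to validate the JWT against the
// /login/jwt-session endpoint (same headers as in the Postman steps).
function buildJwtSessionRequest(tenant, webIntegrationId, token) {
  return {
    url: `https://${tenant}/login/jwt-session`,
    options: {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'qlik-web-integration-id': webIntegrationId,
        'Authorization': `Bearer ${token}`
      }
    }
  };
}

const req = buildJwtSessionRequest('yourcloudtenant', '<web integration id>', '<jwt token>');
console.log(req.url);
// To actually send it (requires a live tenant and a valid token):
// const res = await fetch(req.url, req.options);  // expect res.status === 200
```

A 200 response here means the same thing as the 200 OK in Postman: the token was accepted and a session cookie was issued.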
Step 7: Crafting the JWT Token Endpoint
To provide a convenient endpoint for code reference, you can create a simple .php file that executes the node generate.js command on your server and presents the JWT token in encoded JSON format.
Here's an example of the PHP code:
<?php
header('Content-Type: application/json');
echo json_encode(['body' => `node generate.js`]);
This endpoint URL will be something like https://www.example.com/qlikcloud/jwt_token.php.
Step 8: Building the Iframe Display Page
Finally, let's build the index.php / index.html file that uses the JWT endpoint to acquire the token for the iframe display. You'll need to replace placeholders with your actual details:
<html>
<head>
<script src="https://unpkg.com/enigma.js/enigma.min.js"></script>
</head>
<body>
<div id="main">
<div id="message"></div>
<iframe id='qlik_frame' style='border:none;width:100%;height:900px;'></iframe>
</div>
<script>
// CONFIGURATION
const TENANT = '<yourcloudtenant>';
const JWTENDPOINT = 'https://www.example.com/qlikcloud/jwt_token.php';
const WEBINTEGRATIONID = '<web integration id>';
const APPID = '<appid>';
const SHEETID = '<sheetid>';
const IDENTITY = '<not mandatory, fill in whatever you want>';
// MAIN
(async function main() {
const isLoggedIn = await qlikLogin();
const qcsHeaders = await getQCSHeaders();
const [session, enigmaApp] = await connectEnigma(qcsHeaders, APPID, IDENTITY);
handleDisconnect(session);
const theme = await getTheme(enigmaApp);
renderSingleIframe('qlik_frame', APPID, SHEETID, theme, IDENTITY);
})();
// LOGIN
async function qlikLogin() {
const loggedIn = await checkLoggedIn();
if (loggedIn.status !== 200) {
const tokenRes = await (await getJWTToken(JWTENDPOINT)).json();
const loginRes = await jwtLogin(tokenRes.body);
if (loginRes.status != 200) {
const message = 'Something went wrong while logging in.';
alert(message);
throw new Error(message);
}
const recheckLoggedIn = await checkLoggedIn();
if (recheckLoggedIn.status !== 200) {
const message = 'Third-party cookies are not enabled in your browser settings and/or browser mode.';
alert(message);
throw new Error(message);
}
}
console.log('Logged in!');
return true;
}
async function checkLoggedIn() {
return await fetch(`https://${TENANT}/api/v1/users/me`, {
mode: 'cors',
credentials: 'include',
headers: {
'qlik-web-integration-id': WEBINTEGRATIONID
},
})
}
// Get the JWT and use it to obtain Qlik Cloud session cookie.
async function getJWTToken(jwtEndpoint) {
return await fetch(jwtEndpoint, {
mode: 'cors',
method: 'GET'
})
}
async function jwtLogin(token) {
const authHeader = `Bearer ${token}`;
return await fetch(`https://${TENANT}/login/jwt-session?qlik-web-integration-id=${WEBINTEGRATIONID}`, {
credentials: 'include',
mode: 'cors',
method: 'POST',
headers: {
'Authorization': authHeader,
'qlik-web-integration-id': WEBINTEGRATIONID
},
})
}
async function getQCSHeaders() {
const response = await fetch(`https://${TENANT}/api/v1/csrf-token`, {
mode: 'cors',
credentials: 'include',
headers: {
'qlik-web-integration-id': WEBINTEGRATIONID
},
})
const csrfToken = new Map(response.headers).get('qlik-csrf-token');
return {
'qlik-web-integration-id': WEBINTEGRATIONID,
'qlik-csrf-token': csrfToken,
};
}
// ENIGMA ENGINE CONNECTION
async function connectEnigma(qcsHeaders, appId, identity) {
const [session, app] = await getEnigmaSessionAndApp(qcsHeaders, appId, identity);
return [session, app];
}
async function getEnigmaSessionAndApp(headers, appId, identity) {
const params = Object.keys(headers)
.map((key) => `${key}=${headers[key]}`)
.join('&');
return (async () => {
const schema = await (await fetch('https://unpkg.com/enigma.js@2.7.0/schemas/12.612.0.json')).json();
try {
return await createEnigmaAppSession(schema, appId, identity, params);
}
catch {
// Handle a race condition with new users who do not yet have permissions to access the application. The code makes another attempt after 1.5 seconds.
await new Promise(resolve => setTimeout(resolve, 1500));
try {
return await createEnigmaAppSession(schema, appId, identity, params);
}
catch (e) {
throw new Error(e);
}
}
})();
}
async function createEnigmaAppSession(schema, appId, identity, params) {
const session = enigma.create({
schema,
url: `wss://${TENANT}/app/${appId}/identity/${identity}?${params}`
});
const enigmaGlobal = await session.open();
const enigmaApp = await enigmaGlobal.openDoc(appId);
return [session, enigmaApp];
}
// BONUS! DYNAMICALLY FETCH THEME
async function getTheme(enigmaApp) {
const createAppProps = await enigmaApp.createSessionObject({
qInfo: {
qId: "AppPropsList",
qType: "AppPropsList"
},
qAppObjectListDef: {
qType: "appprops",
qData: {
theme: "/theme"
}
}
});
const appProps = await enigmaApp.getObject('AppPropsList');
const appPropsLayout = await appProps.getLayout();
const theme = appPropsLayout.qAppObjectList.qItems[0].qData.theme;
return theme;
}
// HANDLE ENGINE SESSION CLOSURE
function handleDisconnect(session) {
session.on('closed', () => {
const message = '<Your text here> Due to inactivity or loss of connection, this session has ended.';
document.getElementById('qlik_frame').style.display = "none";
document.getElementById('message').innerHTML = message;
});
session.on('suspended', () => {
const message = '<Your text here> Due to loss of connection, this session has been suspended.';
document.getElementById('qlik_frame').style.display = "none";
document.getElementById('message').innerHTML = message;
});
window.addEventListener('offline', () => {
session.close();
});
}
// HELPER FUNCTION TO GENERATE IFRAME
function renderSingleIframe(frameId, appId, sheetId, theme, identity) {
const frameUrl = `https://${TENANT}/single/?appid=${appId}&sheet=${sheetId}&theme=${theme}&identity=${identity}&opt=ctxmenu,currsel`;
document.getElementById(frameId).setAttribute('src', frameUrl);
}
</script>
</body>
</html>
Customizing Integration Parameters
Before proceeding, let's customize a few essential parameters that will tie your integration together seamlessly. These values act as the foundation for your Qlik Cloud integration, enabling it to function smoothly. Let's take a moment to personalize them:
TENANT: Replace with your Qlik Cloud tenant URL.
JWTENDPOINT: Set this to the endpoint where your JWT token will be generated, for instance: https://www.example.com/qlikcloud/jwt_token.php.
WEBINTEGRATIONID: Input your unique web integration ID.
APPID: Insert the App ID of the Qlik Cloud app you want to embed.
SHEETID: Specify the Sheet ID of the specific sheet you wish to display.
IDENTITY: Optionally, add a unique identity to enhance tracking and management.
With these values tailored to your integration's requirements, you're ready to proceed to the next step.
Witness Your Integration in Action
Now that everything is in place, it's time to witness the magic unfold. By serving your index.php or index.html file, you'll see your Qlik Cloud integration come to life.
Prepare to be amazed as the iframe showcases your chosen app and sheet, all seamlessly authenticated without any hassle. Sit back and enjoy the smooth presentation of data within the iframe.
Conclusion
And there you have it! By following this thorough guide, you've accomplished the intricate process of embedding Qlik Cloud content using iframes and JWT authentication. Your users can now seamlessly access the content without encountering any login hurdles.
Remember, every placeholder like <...> needs to be replaced with your specific details to make the entire process work seamlessly.
Enjoy the enhanced user experience and feel free to reach out if you encounter any challenges or need further assistance!