Forums for Qlik Data Integration solutions. Ask questions, join discussions, find solutions, and access documentation and resources.
Share your best Qlik apps and discuss impacts with peers! Show your work and get recognized for innovative uses of Qlik technologies.
Get started on Qlik Community, find How-To documents, and join general non-product related discussions.
Direct links to other resources within the Qlik ecosystem. We suggest you bookmark this page.
Qlik gives qualified university students, educators, and researchers free Qlik software and resources to prepare students for the data-driven workplace.
The layout container, part of the Qlik Analytics dashboard bundle, is a new object that enables free-form layout of charts, lets you stack visualizations and combine charts into new visualizations, and supports multi-select and grouping of objects for easier sizing, positioning, and alignment. The layout container is a great addition to the many easy-to-use and powerful capabilities available in Qlik Analytics.
SaaS in 60
Last week I wrote about how the Above() function can be used for calculating rolling averages and other accumulations. There is however also an alternative method for doing the same thing:
The As-Of table.
When you use the Above() function, you fetch a number from other rows in a chart or Aggr() table. The As-Of table is slightly different in this respect: It is not a transient table created by an object or an expression – instead it is a real table in the data model.
The idea is to create a secondary month field – the AsOfMonth - that links to multiple real months.
In the example above, you can see that ‘2015 Oct’ links to several preceding months, and each Month in turn links to several rows in a fact table. This means that a specific transaction will be linked to several AsOfMonths.
In the data model, the As-Of table should appear as a separate calendar table that links to the existing primary calendar table:
One way to create this table is the following:
First, make sure that your master calendar has a field “Month” that is defined as the first date of the month, e.g.
Date(MonthStart(Date),'YYYY MMM') as Month,
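For context, that field might sit in a master calendar along these lines (a minimal sketch, assuming the dates come from a Date field in a table called Facts):
[Master Calendar]:
Load distinct
Date,
Year(Date) as Year,
Date(MonthStart(Date),'YYYY MMM') as Month
Resident Facts;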
Then add the following lines at the end of the script:
// ======== Create a list of distinct Months ========
tmpAsOfCalendar:
Load distinct Month
Resident [Master Calendar] ;
// ======== Cartesian product with itself ========
Join (tmpAsOfCalendar)
Load Month as AsOfMonth
Resident tmpAsOfCalendar ;
// ======== Reload, filter and calculate additional fields ========
[As-Of Calendar]:
Load Month,
AsOfMonth,
Round((AsOfMonth-Month)*12/365.2425) as MonthDiff,
Year(AsOfMonth)-Year(Month) as YearDiff
Resident tmpAsOfCalendar
Where AsOfMonth >= Month;
Drop Table tmpAsOfCalendar;
Once this table has been created, you can use the AsOfMonth as dimension in charts where you want rolling averages and accumulations.
If you use the following as your measure
Sum({$<YearDiff={0}>} Sales)
you will get a yearly accumulation – year-to-date up until the day of the script run.
If you instead use
Sum({$<MonthDiff={"<6"}>} Sales) / Count(distinct {$<MonthDiff={"<6"}>} Month)
you will get a 6-month rolling average:
And finally, if you use
Sum({$<MonthDiff={0}>} Sales)
you will get the real, non-accumulated numbers.
I have based the Set Analysis expressions on two fields: YearDiff and MonthDiff. However, for clarity it could be a good idea to add flags in the As-Of table, so that the Set Analysis expressions become even simpler, e.g.
If(MonthDiff=0,1,0) as IsSameMonth,
If(YearDiff=0,1,0) as IsSameYear,
If(MonthDiff<6,1,0) as IsRolling6,
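With flags like these in place, the 6-month rolling average shown earlier can, for example, be written using just a flag (a sketch based on the IsRolling6 flag defined above):
Sum({$<IsRolling6={1}>} Sales) / Count(distinct {$<IsRolling6={1}>} Month)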
Summary: The As-Of table is a good way to calculate rolling averages and accumulations.
Further reading related to this topic:
Calculating rolling n-period totals, averages or other aggregations
The 2023 Gartner® Magic Quadrant™ for Data Integration Tools has been published.
Gartner evaluated 21 data integration vendors and recognized both Qlik and Talend as a Leader in the data integration space.
For Talend, this marks the eighth consecutive year.
In the report, you can get a full picture of the data integration market, a quick comparison of the tools, and see how well each vendor meets Gartner’s criteria for Completeness of Vision and Ability to Execute.
Edited 20th November 2023: CVE number updated.
Edited December 1st 2023: Added November 2023 IR release
Hello Qlik Users,
A security issue in Qlik Sense Enterprise for Windows has been identified, and patches have been made available. Details can be found in the Security Bulletin Critical Security fixes for Qlik Sense Enterprise for Windows (CVE-2023-48365).
Today, we have released eight service releases across the latest versions of Qlik Sense to patch the reported issues. All versions of Qlik Sense Enterprise for Windows prior to and including these releases are impacted:
No workarounds can be provided. Customers should upgrade Qlik Sense Enterprise for Windows to a version containing fixes for these issues. The listed fixes also address CVE-2023-41266 and CVE-2023-41265 (link).
This issue only impacts Qlik Sense Enterprise for Windows. Other Qlik products including Qlik Cloud and QlikView are NOT impacted.
All Qlik software can be downloaded from our official Qlik Download page (customer login required). Follow best practices when upgrading Qlik Sense.
Qlik provides patches for major releases until the next Initial or Service Release is generally available. See Release Management Policy for Qlik Software. Notwithstanding, additional patches for earlier releases may be made available at Qlik’s discretion.
The information in this post and Security Bulletin Critical Security fixes for Qlik Sense Enterprise for Windows (CVE-2023-48365) is disclosed in accordance with our published Security and Vulnerability Policy.
What can be done to mitigate the issue?
No mitigation can be provided. An upgrade should be performed at the earliest. As per Qlik's best practices, the proxy should not be exposed to the public internet, which reduces the attack surface significantly.
What authentication methods are affected?
All authentication methods are affected.
Are environments with HTTP disabled impacted?
Environments are affected regardless of whether HTTP or HTTPS is in use. These vulnerabilities affect the HTTP protocol itself, meaning that even if HTTP is disabled, the environment remains vulnerable.
These attacks do not rely on intercepting any communication and are therefore indifferent to whether the HTTP communication is encrypted or not.
Kind regards, and thank you for choosing Qlik,
Qlik Global Support
Bookmarks allow users or developers to save a selection state in an app. They are useful when a user would like to save their selections in an app to view later or to share with others. For developers, bookmarks are useful if you would like to redirect a user to a specific sheet, with selections, when they open an app.
Let's begin by looking at how to create a bookmark. First, go to the sheet that you would like to bookmark and make sure your selections are already made. Then click on Bookmarks in the toolbar.
The Bookmarks window will open showing any bookmarks that have already been created. There is also a Create new bookmark button that can be clicked to create a new bookmark. After clicking the Create new bookmark button, the Create bookmark dialog window opens.
Here is where you can give the bookmark a name, add a description, and set some optional settings. By default, the Save sheet location is checked. This will save the sheet you are on when you created the bookmark and will navigate the user to this sheet when the bookmark is applied. The Save layout option will save the layout of the sheet and will apply that layout when the bookmark is applied. This is useful if you have made any layout changes on the sheet, such as expanding a pivot table or sorting a chart and want the bookmark to maintain the layout. The last option is to Save variable state in bookmarks. This option will save the current state of any variables when this bookmark is created. Once the bookmark is created, the user can use it at any time to return to the bookmarked state.
Let’s look at some other options that are available when you right click on the name of a bookmark.
There are also many ways to apply a bookmark. A bookmark can be applied from the app overview by clicking Bookmarks or the toolbar of a sheet. A bookmark can be indirectly applied by using a button, for instance, and setting the action to apply the bookmark.
Bookmarks can also be used in set analysis expressions and applied to a visualization. In the expression editor, the bookmark can be inserted as seen below or can be used by name.
Example expressions using the bookmark name:
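As a sketch, assuming a bookmark named “East Coast Sales” (the Sales and Customer fields are placeholders), such expressions could look like this, with the bookmark name acting as the set identifier:
Sum({[East Coast Sales]} Sales)
Count(distinct {[East Coast Sales]} Customer)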
These expressions will apply the bookmark to the chart, much like when you use set analysis.
A bookmark can be edited if the name, description, or selections in the bookmark need to change. There is no longer the need to delete and create a bookmark if a change needs to be made to the selections. To edit a bookmark, click on the Bookmarks button in the toolbar and then click the Details icon next to the bookmark you would like to edit or right click on the bookmark and select View details.
From the details view, you can select the edit icon to make changes to the name or description of the bookmark. Once done, you can click the check mark to stop editing or the update icon to also update the bookmark with the current selections.
Also note that possible alternate states are visible when editing a bookmark. If there were selections made in these states when the bookmark was created, they will be visible here as well. Users have the option to copy the set expression for the bookmark if they would like to use it elsewhere in the app.
Bookmarks are a great tool for your own personal use but also for collaboration. They are easy to create and edit and take the legwork out of remembering what selections were applied when you want to return to a previously viewed state or share an insight with a colleague. Check out Qlik Help to learn more.
Thanks,
Jennell
Edited August 30th, 15:55 CET: Added clarification on older Qlik Sense Enterprise on Windows versions
Edited August 31st, 13:10 CET: Added clarification on possible workarounds (none exist) as well as information regarding what authentication methods (all) are affected and that HTTP and HTTPS are impacted
Edited November 21st, 8:40 CET: Added clarification to apply the latest patches
Hello Qlik Users,
Two security issues in Qlik Sense Enterprise for Windows have been identified and patches made available. Details can be found in Security Bulletin Critical Security fixes for Qlik Sense Enterprise for Windows (CVE-2023-41266, CVE-2023-41265).
This announcement from August 2023 and the mentioned releases only cover CVE-2023-41266 and CVE-2023-41265. Apply the most recent patches as documented in Critical Security fixes for Qlik Sense Enterprise for Windows (CVE-2023-48365) (September 2023), which resolve CVE-2023-48365 as well.
Today, we have released five service releases across the latest versions of Qlik Sense to patch the reported issues. All versions of Qlik Sense Enterprise for Windows prior to and including these releases are impacted:
All prior versions of Qlik Sense Enterprise on Windows are affected, including releases such as May 2022, February 2022, and earlier. While no patches are currently listed for these versions, Qlik is actively investigating the possibility of patching older releases.
No workarounds can be provided. Customers should upgrade Qlik Sense Enterprise for Windows to a version containing fixes for these issues. August 2023 IR released today already contains the fix.
This issue only impacts Qlik Sense Enterprise for Windows. Other Qlik products including Qlik Cloud and QlikView are NOT impacted.
All Qlik software can be downloaded from our official Qlik Download page (customer login required). Follow best practices when upgrading Qlik Sense.
The information in this post and Security Bulletin Critical Security fixes for Qlik Sense Enterprise for Windows (CVE-2023-41266, CVE-2023-41265) are disclosed in accordance with our published Security and Vulnerability Policy.
What can be done to mitigate the issue?
No mitigation can be provided. An upgrade should be performed at the earliest. As per Qlik's best practices, the proxy should not be exposed to the public internet, which reduces the attack surface significantly.
What authentication methods are affected?
All authentication methods are affected.
Are environments with HTTP disabled impacted?
Environments are affected regardless of whether HTTP or HTTPS is in use. These vulnerabilities affect the HTTP protocol itself, meaning that even if HTTP is disabled, the environment remains vulnerable.
These attacks do not rely on intercepting any communication and are therefore indifferent to whether the HTTP communication is encrypted or not.
Kind regards, and thank you for choosing Qlik,
Qlik Global Support
Version 4.7. Current as of: 7th December 2023
Qlik and Talend, a Qlik company, may from time to time use the following Qlik and Talend group companies and/or third parties (collectively, “Subprocessors”) to process personal data on customers’ behalf (“Customer Personal Data”) for purposes of providing Qlik and/or Talend Cloud, Support Services and/or Consulting Services.
Qlik and Talend have relevant data transfer agreements in place with the Subprocessors (including group companies) to enable the lawful and secure transfer of Customer Personal Data.
Please note that, as of the date of the most recent version of this list, Qlik and Talend do not process Customer Personal Data on each other’s behalf (i.e., if you purchase Qlik offerings, your organization’s Customer Personal Data will only be processed by Qlik affiliates and Qlik third party subprocessors, and not those of Talend, and vice-versa).
You can receive updates to this Subprocessor list by subscribing to this blog or by enabling RSS feed notifications.
Subprocessors for Qlik offerings:
Third party subprocessors for Qlik Cloud

Third Party | Location of processing | Service Provided
Amazon Web Services (AWS) | If EU region is chosen: Ireland (Republic of) & Paris, France (back-up); or Frankfurt, Germany & Milan, Italy (back-up); or London, UK & Spain (back-up); or Frankfurt, Germany (Blendr only). If US region is chosen: North Virginia, US & Ohio, US (back-up). Customer may select one of two APAC locations: Sydney, Australia & Melbourne, Australia (back-up); or Singapore & Seoul, South Korea (back-up). | Qlik Cloud is hosted through AWS
MongoDB | If EU region is chosen: Ireland (Republic of) & Paris, France (back-up); or Frankfurt, Germany & Milan, Italy (back-up); or London, UK & Spain (back-up); or Frankfurt, Germany (Blendr only). If US region is chosen: North Virginia, US & Ohio, US (back-up). Customer may select one of two APAC locations: Sydney, Australia & Melbourne, Australia (back-up); or Singapore & Seoul, South Korea (back-up). | Any data inputted into the Notes feature in Qlik Cloud
Third party subprocessors for Support Services and/or Consulting Services

The vast majority of Qlik’s support data that it processes on behalf of customers is stored in Germany (AWS). However, in order to resolve and facilitate the support case, such support data may also temporarily reside on the other systems/tools below.

Third Party | Location of processing | Service Provided
Amazon Web Services (AWS) | Germany | Support case management tools
Salesforce | UK | Support case management tools
Grazitti SearchUnify | United States | Support case management tools
Microsoft | United States | Customer may send data through Office 365
Ada | Germany | Support Chatbot
Persistent | India | R&D Support Services
Altoros | United States | R&D Support Services
Ingima | Israel | R&D Support Services
Galil | Israel | R&D Support Services
ISS Consult | Romania | Support services for Blendr only
Wipro | India | IT support services
Third party subprocessors for mobile device apps

Third Party | Location of processing | Service Provided
Google Firebase | United States | Push notifications
Qlik Affiliate Subprocessors

These affiliates may provide services, such as Consulting or Support, depending on your location and agreement(s) with Qlik. Qlik’s Support Services are predominantly performed in the customer’s region: EMEA – Sweden, Spain, Israel; Americas – USA; APAC – Japan, Australia, India.

Subsidiary Affiliate | Location
QlikTech International AB | Sweden
QlikTech Nordic AB | Sweden
QlikTech Latam AB | Sweden
QlikTech Denmark ApS | Denmark
QlikTech Finland OY | Finland
QlikTech France SARL | France
QlikTech Iberica SL (Spain) | Spain
QlikTech Iberica SL (Portugal liaison office) | Portugal
QlikTech GmbH | Germany
QlikTech GmbH (Austria branch) | Austria
QlikTech GmbH (Swiss branch) | Switzerland
QlikTech Italy S.r.l. | Italy
QlikTech Netherlands BV | Netherlands
QlikTech Netherlands BV (Belgian branch) | Belgium
Blendr NV | Belgium
QlikTech UK Limited | United Kingdom
Qlik Analytics (ISR) Ltd. | Israel
QlikTech International Markets AB (DMCC Branch) | United Arab Emirates
QlikTech Inc. | United States
QlikTech Corporation (Canada) | Canada
QlikTech México S. de R.L. de C.V. | Mexico
QlikTech Brasil Comercialização de Software Ltda. | Brazil
QlikTech Japan K.K. | Japan
QlikTech Singapore Pte. Ltd. | Singapore
QlikTech Hong Kong Limited | Hong Kong
Qlik Technology (Beijing) Limited Liability Company | China
QlikTech India Private Limited | India
QlikTech Australia Pty Ltd | Australia
QlikTech New Zealand Limited | New Zealand
Subprocessors for Talend offerings
Third party subprocessors for Talend Cloud

Third Party | Location of processing | Service Provided
Amazon Web Services (AWS) | Talend Cloud – AMERICAS: Virginia, US & Oregon, US (backup). EMEA: Frankfurt, Germany & Ireland (Republic of) (backup). APAC: Tokyo, Japan & Singapore (backup); or Sydney, Australia & Singapore (backup). Stitch – AMERICAS: Virginia, US & Oregon, US (backup). EMEA: Frankfurt, Germany & Ireland (Republic of) (backup). | These Talend Cloud locations are hosted through AWS
Microsoft Azure | United States: California; Virginia (backup) | These Talend Cloud locations are hosted through Microsoft Azure
MongoDB | See Talend Cloud locations above |
Third party subprocessors for Support Services and/or Consulting Services

In order to provide Support and/or Consulting Services, the following third party tools may be used.

Sub-processor | Data Center Location | Service Provided
Github | United States | Support ticket replication, troubleshooting
Intercom | United States | In-app customer support messaging service
Atlassian | France; United States | Project management; support issue tracking
Microsoft | United States | Email provider, if the Customer sends Customer Personal Data through email
Proofpoint Secure Share | United States | File sharing if Customer files include Customer Personal Data
Salesforce | United States | CRM; support case management
Talend Affiliate Subprocessors
These affiliates may provide services, such as Consulting or Support, depending on your location and agreement(s) with Talend. Qlik’s Support Services are predominantly performed in the customer’s region: EMEA – France, Germany, UK; Americas – USA, Canada; APAC – China, Japan, Australia, India, Singapore.
Subsidiary Affiliate | Location
Talend Australia Pty Ltd. | Australia
Talend China Beijing Technology Co. Ltd. | China
Talend (Canada) Limited | Canada
Talend SAS | France
Talend Germany GmbH | Germany
Talend Data Integration Services Private Limited | India
Talend Italy S.r.l. | Italy
Talend Limited | Ireland
Talend KK | Japan
Talend Netherlands B.V. | Netherlands
Talend Sucursal Em Portugal | Portugal
Talend Singapore Pte. Ltd. | Singapore
Talend Spain, S.L. | Spain
Talend Sweden AB | Sweden
Talend GmbH | Switzerland
Talend Ltd. | United Kingdom
Talend, Inc. | United States
Talend USA, Inc. | United States
In addition to the above, other professional service providers may be engaged to provide you with professional services related to the implementation of your particular Qlik and/or Talend offerings; please contact your Qlik account manager or refer to your SOW to see whether these apply to your engagement.
Qlik and Talend reserve the right to amend their products and services from time to time. For more information, please see www.qlik.com/us/trust/privacy and/or https://www.talend.com/privacy/.
As part of scheduled maintenance for Qlik Community, there will be a degraded experience while managing your support cases on Tuesday, December 12, for a period of 4-8 hours. We will share updates when complete.
Time Zone | Start Time
CET | 9:00 AM
CST | 2:00 AM
EST | 3:00 AM
PST | 12:00 AM
Viewing your cases will be possible during this time if you are currently logged in, but you will not be able to add comments or create a new case. You will be unable to log back in if you have logged out.
Our support engineers will still have access to view and manage your cases and can make updates as needed on your behalf.
If you are experiencing any issue during this window, please contact us via chat. You will be able to access our chat and can be connected to an engineer by using our Support Chat on the Qlik Community Support Page.
For Production Down Severity 1 Issues, please use our Critical issue chat line for urgent assistance.
Choose Alert Here on our Support Page’s Critical Issue card.
Thanks,
Qlik Support
Hi everyone,
Want to stay a step ahead of important Qlik support issues? Then sign up for our monthly webinar series where you can get first-hand insights from Qlik experts.
Next Thursday, December 14th, Qlik will host another Techspert Talks session, and this time we are looking at Exploring Data Essentials with Qlik Cloud.
But wait, what is it exactly?
Techspert Talks is a free webinar held every month, where you can hear directly from Qlik Techsperts on topics that are relevant to Customers and Partners today.
In this session, we will cover:
Click on this link to choose the webinar time that's best for you.
The webinar is hosted using ON24 in English and will last 30 minutes plus time for Q&A.
Hope to see you there!!
New data is being created every day as we carry on our daily activities and interact with the world. There is hardly an industry that doesn’t deal with data.
The exponential growth of data opens the door to opportunity and competitive advantage for those who know how to use it. A data-informed mindset is essential to deliver insights and transform organizations.
With Qlik’s Analytics Expert Program “Applied Data Analytics using Qlik Sense”, you will not only learn data analytics best practices but also learn how to achieve a data-informed mindset that shifts you from just looking for data and information to looking for insights and knowledge.
You will learn best practices in data analytics, data literacy, and data-informed decision making that help you make the most effective use of Qlik Sense. In just 15 weeks, you will be on your way to becoming a leader in developing a data-driven culture in your organization.
Learn More and Register to get started and save your spot today!
Transformational. Innovative. Powerful. These are just a few of the inspiring terms customers use to describe Qlik and our product suite. Customer feedback and input are critical to Qlik’s success, and we regularly hear directly from our most active users through Qlik Nation, our gamified customer engagement hub.
Through interactive challenges and activities, Qlik Nation members connect, learn, and engage with Qlik and with each other. By completing challenges in the platform, members can receive early notification of new product features, boost knowledge through quality educational content, and have opportunities to influence the product roadmap. Qlik Nation also allows customers to demonstrate skills, network with peers, and amplify their personal brand. All while having fun and being rewarded!
And now, Qlik Nation members will be able to complete challenges while visiting Qlik Community. Current members will see a new carousel on the Qlik Community homepage inviting them to complete challenges, making it even easier for you to engage and earn points.
Customers are at the heart of everything we do, and Qlik Nation offers an exclusive experience for our most dedicated and passionate fans (Qlik Nation is a complementary platform to Qlik Community, which has open membership). What do our members say about their experience in Qlik Nation?
If you’re a Qlik end-user and want to learn more about how to join Qlik Nation, please email QlikNation@qlik.com. We’d love to hear from you!
Great news!
Qlik has recently introduced conversational analytics in Microsoft Teams. With our new Teams app, you can easily chat with Insight Advisor, Qlik's intelligent AI assistant, to explore data using natural language directly within Teams.
Users can now ask questions through individual or group chat, and Qlik will respond with AI-generated data visualizations and insights using data from across your Qlik apps. And because it's Microsoft Teams, you can collaborate with others in real-time, collectively making decisions using the insights generated by Qlik. Insight Advisor within Teams provides a powerful new way to help more people find the right answers, make better decisions and collaborate together where and how they work.
Look for the Qlik app in the Teams App Store to get started.
We have put together resources to help you get started.
Video tutorials (SaaS in 60):
Introduction to Qlik Conversational Analytics in Teams
Microsoft Teams Integration Setup - Part 1
Microsoft Teams Integration Setup - Part 2
Qlik Documentation:
Accessing Insight Advisor Chat through external collaboration platforms
Exploring app content with conversational analytics in Microsoft Teams
Managing connections to external collaboration platforms
Availability of the Qlik chatbot app in Microsoft Teams
Thank you for choosing Qlik,
Qlik Support
I am happy to introduce our App Analyzer for Qlik Cloud, which can help answer these questions and more.
The app provides insights on:
The App Analyzer uses a single REST connection to iterate over application metadata endpoints within a tenant. The data retrieved can then be measured against tenant quotas and user-defined thresholds, empowering Admins to act on the insights that the app reveals. To see the app in action, check out this demo:
A few things to note:
Check out Optimizing Qlik Sense SaaS Apps with App Analyzer for an in-depth dive into the App Analyzer.
The app as well as the configuration guide are available via GitHub, linked below.
Any issues or enhancement requests should be opened on the Issues page within the app’s GitHub repository.
Be sure to subscribe to the Qlik Support Updates Blog by clicking the green Subscribe button to stay up-to-date with the latest Qlik Support announcements. Please give this post a like if you found it helpful!
Kind regards,
Qlik Platform Architects
Additional Resources:
Techspert Talk: Optimizing Qlik Sense SaaS Apps with App Analyzer
Our other monitoring apps for Qlik Cloud can be found below.
If you’ve just installed Qlik Sense Enterprise, then this image probably looks familiar. Alternatively, Chrome might display The site's security certificate is not trusted, while Firefox may report This Connection is Untrusted.
By default, Qlik Sense uses a self-signed certificate to enable HTTPS access across both the Hub (https://YourSenseServer/hub) and the Management Console (https://YourSenseServer/qmc). But self-signed certificates cannot be validated or trusted by web browsers and tend to prompt a warning message.
That's alright though. All we need is the following:
So, let’s get started.
What is the current certificate used for?
During the initial install, the Qlik Sense Repository Service creates a set of certificates. Their purpose is to:
Qlik Sense uses certificates to authenticate its services across all nodes. See the Qlik Sense Online Help for details. In addition, other products (such as Qlik NPrinting) require these certificates to establish a connection.
Note: We will not modify, replace, or remove the originally created certificates. Doing so will break service communication.
What we’ll do instead is to add an additional one.
Certificate options, or: What type of certificate is right for me?
There are three possible types of certificates for us to use.
Requirements, or: What to look out for when getting your cert.
When Support gets certificate questions, they are most often related to a certificate missing its private key. Always verify that the certificate comes bundled with a private key when you install it.
It’ll look like this in the certificate properties, with a note reading “You have a private key that corresponds to this certificate”:
As far as formats and algorithms are concerned, the following are confirmed to work with Qlik Sense:
Where to get a certificate and how to do a CSR?
The Certificate Authority you chose will have instructions for this, and if you are looking to get a self-signed one or one from your corporation's CA, then a local administrator can provide the certificate to you.
Either way, you are going to need to generate a Certificate Signing Request (CSR) to pass on to your CA. There are tools out there to get that done, such as certreq from Microsoft (found here), and SSL Shopper has a great article on the topic, which I often send to customers when they ask us about CSRs and how to create them.
Once you obtain the certificate, we'll move on to installing it and activating it in Qlik Sense. This will be done in three quick steps:
Importing the Certificate
As mentioned before, we are not replacing certificates. The already existing ones will not be deleted. Doing so would break service authentication between the individual Qlik Sense services and render the system… broken.
Step 1:
On the Qlik Sense node running the Qlik Sense Proxy, log on with the user running the Sense services. This is important since the certificate needs to be accessible for this account.
Step 2:
If the certificate was saved in the .pfx format, then all you need to do is double click the file. Follow the prompt to import the certificate into the Personal store.
Longer Step 2:
If you want to import it manually or verify if it was correctly installed, then we'll need to do a little more work.
Getting the Thumbprint
Well, since we are already in the MMC, let's open the freshly installed certificate again.
Configuring the Qlik Sense Proxy
Almost done!
Click Apply.
The Sense Proxy will now restart. During the restart, it will be using Windows API calls to correctly bind the new certificate to its SSL ports.
Verification, or: How to prove the certificate was accepted.
In the web browser:
When opening the Qlik Sense Hub or QMC, the certificate will now be displayed in the browser. This may look different depending on the web browser, but in Google Chrome you can click the padlock to the left of the URL to verify what certificate is used.
The information displayed needs to match the properties of the certificate you installed.
In the log files:
If you’d rather see what the Qlik Sense Proxy service is doing, then you can directly check up on that, too.
On the Proxy node, go to C:\ProgramData\Qlik\Sense\Log\Proxy\Trace and open the Security log file from just after the last start.
It will now print a slightly different message than before:
Security.Proxy.Qlik.Sense.Common.Security.Cryptography.LoggingDigester DOMAIN\_service Setting crypto key for log file secure signing: success
Security.Proxy.Qlik.Sense.Common.Security.Cryptography.SecretsKey DOMAIN\_service retrieving symmetric key from cert: success
Security.Proxy.Qlik.Sense.Common.Security.Cryptography.CryptoKey DOMAIN\_service setting crypto key: success
Security.Proxy.Qlik.Sense.Communication.Security.CertSetup 'CN=localhost' (08C871933A58E072FED7AD65E2DB6D5AD3EAF9FA) as SSL certificate presented to browser, which is a 3rd party SSL certificate
And that's it!
There isn't much more to it in a standard Qlik Sense Enterprise installation, but if you have more questions, then maybe a few of these articles can help:
I applied my certificate and it seems to be using it correctly, but browsers are still saying the Common Name is Invalid?
ERR_CERT_COMMON_NAME_INVALID when using 3rd party certificate
Qlik Sense keeps reverting to the default and complains it can't find a valid SSL certificate with the thumbprint.
The certificate may not have a Private key or the service account does not have access to it.
How to: Manage Certificate Private Key
The Qlik Sense Service account doesn't have admin privileges and the certificate is not accepted.
I hope that this was useful 😊 Stay tuned for an upcoming post where we’ll focus on QlikView and how to enable HTTPS for its AccessPoint, and don’t forget to subscribe to this blog for more content delivered by #QlikSupport. We’ll be watching for your comments and questions and we’ll get back to you as soon as possible. Your feedback is always appreciated.
Modern embedded analytics solutions using Qlik offer a stack of open-source libraries to build customized analytical platforms backed by the robustness of Qlik’s Associative Engine. These libraries facilitate communication with the Engine and provide the flexibility to develop your own client and services, or to integrate visualizations and mashups. Historically, the Capability APIs have been used extensively to build mashups and perform application-related operations. An alternative offering to Capability API-based operations is Nebula.js and Enigma.js.
To clarify the use of each of these libraries, let’s break down their functionality in a simple way and help the developer community get started with them.
In this tutorial, we are going to focus on a user scenario that involves applying both Enigma.js and Nebula.js. This way we can have a fair idea of their applications and get started with their immense capabilities.
User scenario: A company needs to build its own analytics solution while being able to leverage data from Qlik’s Associative Engine and embed a couple of visualizations. The overall goal is to have some buttons in their webpage that control the ‘dimensions’, allowing them to re-render the visualizations based on which button is clicked. They have handled a similar situation in native Qlik Sense using ‘variables’, and they would like the same capability in their own web solution.
Solution: Since the company has taken advantage of ‘variables’ in the native QS, a similar approach for their web solution using the two libraries would be the following:
Let’s go through each of the steps in detail to understand how we can implement them -
Step 1 — Creating Mashup template: In this tutorial, the focus is not on developing mashups but to implement the user scenario. A very simple tutorial to get started with developing mashups is on the official Qlik Developer site. Link: https://qlik.dev/tutorials/build-a-simple-mashup-using-nebulajs
Using the command line interface, we create a web project that has the following structure -
/src
Step 2 — Fetch the variable using Enigma.js: As discussed, Enigma.js allows us to communicate with the QIX engine and enables us to perform CRUD (create, read, update, delete) operations on QS apps and their entities. Since our target in this case is to read a variable named ‘vChangeField’ from a QS app, we first create an object with a list definition like the one below:
const variableListProps = {
qInfo: {
qId: "VariableList",
qType: "VariableList",
},
qVariableListDef: {
qType: "variable",
qData: {
tags: "/tags",
name: "/",
},
},
};
In order to create the list object that we have defined, we use the createSessionObject() method provided by Enigma.js. After that, the properties of the object, including dynamic properties, are retrieved using the getLayout() function and passed on to passVariable(layout) like below:
const variableListModel = await app
.createSessionObject(variableListProps)
.then((model) => model);
variableListModel.getLayout().then((layout) => {
passVariable(layout);
});
Now that we have the object properties in the layout, the next step is to retrieve the ‘variable’ inside our function passVariable() and use it on the ‘action-button’ that we will create for our mashup.
function passVariable(layout) {
const {
qVariableList: { qItems: data },
} = layout;
var pass_need = data[1].qName;
}
So, we finally have our desired variable stored in ‘pass_need’. This is all we had to do to fetch our variable using Enigma.js.
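For completeness, the app handle used above comes from an Enigma.js session opened against the engine, roughly like this (a sketch only; the schema version, WebSocket URL, and app ID are placeholders for your own environment, and the calls would sit inside an async setup function):
import enigma from "enigma.js";
import schema from "enigma.js/schemas/12.170.2.json";
// Open a QIX session over WebSocket and grab a handle to the app
const session = enigma.create({
  schema,
  url: "wss://your-qlik-host/app/your-app-id", // placeholder URL
  createSocket: (url) => new WebSocket(url),
});
const global = await session.open();
const app = await global.openDoc("your-app-id"); // placeholder app ID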
Step 3 — Load and register ‘action-button’ and other charts: Our next step is to load the QS objects, i.e., ‘action-button’ and ‘combo’ and ‘bar’ charts required as part of this use case in our mashup and, to do that we will leverage Nebula.js. So, let’s go to the template defined in the configure.js file where we can see the following:
import { embed } from '@nebula.js/stardust';
import barchart from '@nebula.js/sn-bar-chart';
const n = embed.createConfiguration({
context: {
theme: 'light',
language: 'en-US',
},
types: [
{
name: 'barchart',
load: () => Promise.resolve(barchart),
},
],
});
export default n;
In this file, we have the initial configuration needed from a Nebula.js perspective. First, the embed function is imported from the ‘@nebula.js/stardust’ package, and then, using the configuration object, we define the guidelines for our visualization and website. We also see that a bar-chart module is loaded from its package and then registered under types. This is what needs to be done to render our QS visualizations.
For this use case, we need three action buttons, one bar chart, and one combo chart in our mashup. So, let’s load and register these three visualization types as shown in the snippet below:
import { embed } from '@nebula.js/stardust';
import barchart from '@nebula.js/sn-bar-chart';
import actionButton from '@nebula.js/sn-action-button';
import combochart from '@nebula.js/sn-combo-chart';
const n = embed.createConfiguration({
context: {
theme: 'dark',
language: 'en-US',
},
types: [
{
name: 'barchart',
load: () => Promise.resolve(barchart),
},
{
name: 'action-button',
load: () => Promise.resolve(actionButton),
},
{
name: 'combochart',
load: () => Promise.resolve(combochart),
},
],
});
export default n;
Step 4 — Create the action-buttons and render: In our previous step, we loaded and registered the three types of visual components we need as part of this mashup. Now, we need to use the fetched variable ‘vChangeField’ in the three action-buttons and render them. First, let’s create a new embed instance using the Enigma app in the index.js file and then render the action-buttons using the render function. This function renders a visualization into an HTMLElement.
const n = embed(app);
function passVariable(layout) {
n.render({
type: "action-button",
element: document.querySelector(".object_new"),
properties: {
actions: [
{
actionType: "setVariable",
variable: pass_need,
value: "Decade",
},
],
style: {
label: "By Decade",
font: {
size: 0.7,
style: {
italic: true,
},
},
background: {
color: "Grey",
},
border: {
useBorder: true,
radius: 0.25,
width: 0.1,
color: "Grey",
},
icon: {},
},
},
}),
n.render({
type: "action-button",
element: document.querySelector(".object_new"),
properties: {
actions: [
{
actionType: "setVariable",
variable: pass_need,
value: "Actor",
},
],
style: {
label: "By Actor",
font: {
size: 0.7,
style: {
italic: true,
},
},
background: {
color: "Grey",
},
border: {
useBorder: true,
radius: 0.25,
width: 0.1,
color: "Grey",
},
icon: {},
},
},
}),
n.render({
type: "action-button",
element: document.querySelector(".object_new"),
properties: {
actions: [
{
actionType: "setVariable",
variable: pass_need,
value: "Director",
},
],
style: {
label: "By Director",
font: {
size: 0.7,
style: {
italic: true,
},
},
background: {
color: "Grey",
},
border: {
useBorder: true,
radius: 0.25,
width: 0.1,
color: "Grey",
},
icon: {},
},
},
});
}
While creating the three action-buttons, we also need to specify their properties under properties as seen above. One of the important things to consider when defining the properties for an ‘action-button’ is what action we want this button to execute on click. This is very similar to what we do non-programmatically in the native Qlik Sense app with action buttons. For this use case, we want these buttons to set a variable when clicked. So, under actions we set actionType: “setVariable”. The next step is to pass the variable that we have retrieved using Enigma.js. In Step 2, we stored the variable in ‘pass_need’, so we pass this to the variable: pass_need property. We also set the value that each button assigns through the value property, e.g. value: “Director”.
Step 5 — Integrate all the visualizations: Our final step is to bring all QS objects together in the mashup. We have already created and rendered the action buttons and, since we already have the combo and bar-chart in our QS app, we don’t need to create them but we will just retrieve them using their object IDs like the snippet below:
n.render({
element: document.querySelector(".object"),
id: "QVngr", //object ID of combo-chart
}),
n.render({
element: document.querySelector(".object"),
id: "JbtsBVB", //object ID of bar-chart
});
}
As we can see, the rendering of the charts changes based on the button we click. The button leverages the ‘variable’ defined in our Qlik Sense application. This tutorial presents an alternative to the Variable API typically used for performing CRUD operations with variables, and it demonstrates the usefulness of the Nebula.js and Enigma.js frameworks for developing modern embedded analytics solutions.
The Source code for this mashup can be found here: https://github.com/dipankarqlik/Variable_Enigma
Insights in a nutshell: Although there are almost twice as many movies as series in the Top 10, series seem to be preferred by the audience; movies account for only 28% of viewings compared to series. Squid Game is the show that performed best: it remained at rank 1 for 9 consecutive weeks and was the most viewed content across all categories. Most impressively, Squid Game was viewed twice as much as Money Heist, which sits in second position among the top 10 most viewed titles. Red Notice stayed at rank 1 for 3 consecutive weeks, the longest run for a movie. The life span of English-language content in the Top 10 is short compared to content in other languages, and series remain in the Top 10 for more weeks than movies. About a quarter of the total views of all Top 10 shows comes from the top 5 series. After a movie or series has been ranked number 1, it tends to drop off the charts quickly, although a few movies have shown that it is possible to reverse the curve at the end of their life span. Movies and series have different dynamics: in general there is only one movie at a time in the Top 10, with few exceptions, whereas series overlap each other, with multiple series appearing in the Top 10 in the same week.
Recommendations: Based on the dataset, we can make a few recommendations: 1. Send out new adverts for movies one week after their release through the different Netflix communication channels, in order to avoid such a sharp fall in the ranking and make the most of them. 2. Release more series than movies to follow the trend. 3. Continue to release more foreign-language series, as subscribers seem to enjoy them globally.
Ever found yourself stuck with a messy pile of data that seems more like a labyrinth than a pathway to clean insights? You're not alone. Today, we're diving into the world of data cleaning in Qlik Sense to help you uncover the analytical potential hiding behind your data.
Imagine you're baking a cake. Would you eyeball the measurements of your ingredients? Probably not, unless you fancy a disaster cake. Just like one poorly measured cup of flour can ruin your entire recipe, a small data error can throw off your whole analysis. That's why, before you dive into the fun part—data analysis—you've got to make sure your key ingredient (data) is as clean and precise as possible.
It's not just about tidying up; it's about quality control. Skipped steps or overlooked errors can lead to inaccurate results that could misinform your business decisions.
Mapping tables behave differently from other tables: they are stored in a separate area of memory and are used strictly as mapping tables while the script runs; once the script has finished, they are automatically dropped.
Let’s take a look at how to do this and the different statements and functions that can be used:
CountryMap:
MAPPING LOAD * INLINE [
Country, NewCountry
U.S.A., US
U.S., US
United States, US
United States of America, US
];
Keep in mind that a mapping table must have two columns, the first containing the comparison values and the second containing the desired mapping values.
The ApplyMap function is used to replace data in a field based on a previously created Mapping Table.
CountryMap:
MAPPING LOAD * INLINE [
Country, NewCountry
U.S.A., US
U.S., US
United States, US
United States of America, US
];
Data:
LOAD
ID,
Name,
ApplyMap('CountryMap', Country) as Country,
Code
FROM [lib://DataFiles/Data.xlsx]
(ooxml, embedded labels, table is Sheet1);
The first parameter in ApplyMap is the Mapping Table name in quotes. The second parameter is the field containing the data that needs to be mapped.
You can add a third parameter to the ApplyMap function that serves as a default to handle cases when the value doesn’t match one in the Mapping Table.
For instance:
ApplyMap('CountryMap', Country, 'Rest of the world') As Country
after mapping:
ReplaceMap:
MAPPING LOAD * INLINE [
char, replace
")", ""
"(", ""
"\"", ""
"/", ""
"-", ""
] (delimiter is ',');
TestData:
LOAD
DataField as data,
MapSubString('ReplaceMap', DataField) as ReplacedString
INLINE [
DataField
"(415)555-1234",
"(415)543,4321",
"“510”123-4567",
"/925/999/4567"
] (delimiter is ',');
after cleaning:
// Automatically apply the CountryMap mapping to the Country field in the LOAD statements below
Map Country Using CountryMap;
Data1:
LOAD
ID,
Name,
Country
FROM [lib://DataFiles/Data.xlsx]
(ooxml, embedded labels, table is Sheet1);
Data2:
LOAD
ID,
Country as Country2
FROM [lib://DataFiles/Data.xlsx]
(ooxml, embedded labels, table is Sheet1);
// Stop automatic mapping for any fields loaded after this point
UNMAP;
UserData:
LOAD * INLINE [
UserID, FullName
1, "John,Doe"
2, "Jane,Doe"
3, "Alice,Wonderland"
4, "Bob,Builder"
];
CleanedData:
LOAD
UserID,
SubField(FullName, ',', 1) as FirstName,
SubField(FullName, ',', 2) as LastName
RESIDENT UserData;
Drop Table UserData;
Example 1:
Using a combination of the functions above to clean up a field. Let’s take a more complex field and try to extract the first name and last name.
UserData:
LOAD * INLINE [
UserID, Object
1, "37642UI101John.Doe"
2, "98322UI101Jane.Doe"
3, "45432UI101Alice.Wonderland"
4, "32642UI101Bob.Builder"
];
CleanedData:
LOAD
UserID,
SubField(Right(Object, Len(Object) - Index(Object, 'UI101') - 4), '.', 1) as FirstName,
SubField(Right(Object, Len(Object) - Index(Object, 'UI101') - 4), '.', 2) as LastName
RESIDENT UserData;
Drop Table UserData;
after cleaning:
Example 2:
Cleaning HTML in a field
Paragraphs:
LOAD * INLINE [
Paragraph_ID, Paragraph
1, "<p>This is a <strong>paragraph</strong>.</p><br><p>This is another <em>paragraph</em>.</p>"
];
// Loop through each paragraph in the Paragraphs table
For vRow = 1 to NoOfRows('Paragraphs')
Let vID = Peek('Paragraph_ID', vRow-1, 'Paragraphs'); // Get the ID of the next record to parse
Let vtext = Peek('Paragraph', vRow-1, 'Paragraphs'); // Get the original paragraph of the next record
// Strip each HTML tag from the paragraph, one at a time
Do While len(TextBetween(vtext, '<', '>')) > 0
vtext = Replace(vtext, '<br>', chr(10)); // Replace line breaks with carriage returns - improves legibility
vtext = Replace(vtext, '<' & TextBetween(vtext, '<', '>') & '>', ''); // Find groups with <> and replace them with ''
Loop;
// Store the cleaned paragraphs into a temporary table
Temp:
Load
$(vID) as Paragraph_ID,
'$(vtext)' as cleanParagraph
AutoGenerate 1;
Next vRow;
// Join the cleaned paragraphs back into the original Paragraphs table
Left Join (Paragraphs)
Load *
Resident Temp;
// Drop the temporary table
Drop Table Temp;
after cleaning:
I hope you found this post helpful!
Attached you can find a QVD that contains the scripts used in the post.
Happy data cleaning!
In the past few posts, I have discussed the modern, lightweight framework from Qlik, Nebula.js, and its usage in developing various Qlik Sense objects, such as creating a new visualization chart or building embedded analytics solutions like mashups. Nebula.js is a collection of product- and framework-agnostic JavaScript libraries and APIs that help developers easily integrate out-of-the-box capabilities on top of the Qlik Associative Engine. So, let’s assume you have an existing extension developed using the Extension API, and you would like to migrate this extension to the Nebula.js framework. How would you do this?
The focus of this post is to understand what resources are required by a Developer to effectively migrate an existing extension developed using Extension API to the Nebula.js framework and its potential benefits.
To demonstrate the overall migration process, I will take an existing visualization extension that I have developed in the past using the Extension API. The extension is a Scatter-Pie plot that allows us to visualize each scatter plot’s bubble as a pie-chart to understand the Sales-Profit correlation for the three categories state-wise as shown below:
Let us try to recreate the exact same visualization by leveraging the novel Nebula.js framework.
Traditional Extension API-based visualizations need JavaScript code (.js), a metadata file (.qext), and stylesheets (.css). The logic of the extension is controlled using the code in the JavaScript file and so that is the entry point for your development activity. In Nebula.js, we try to segregate the various modules of the code to align it to a modular architecture, thus providing greater flexibility to the developers. Let’s deep dive into the steps required to migrate our current extension.
Step 1: Create a Nebula project structure.
The first step is to get all the files required as part of developing the extension. We can use the Nebula.js CLI like below and structure the project.
npx @nebula.js/cli create hello --picasso none
Executing the above command, gives us the three required files.
Step 2: Starting the local development server.
One of the advantages of using the Nebula.js framework is that it comes with a local development server that allows developers to see their output as they code. To start the server, execute the following command.
cd hello
npm run start
Now, we will need to connect to Qlik’s Associative Engine using the WebSocket protocol. Once we establish a connection, we will be presented with the list of apps to test our visualization in the Developer UI.
Step 3: Configuring the data structure.
Next, we need to configure the data structure and define our hypercube as shown in the code snippet below. This is similar to what we have in our current extension (Extension API). However, all this information is saved under one single JavaScript file in the older approach.
const properties = {
showTitles: true,
qHyperCubeDef: {
qInitialDataFetch: [{ qWidth: 6, qHeight: 100 }],
},
definition: {
type: "items",
component: "accordion",
items: {
dimensions: {
uses: "dimensions",
min: 1,
max: 6,
},
measures: {
uses: "measures",
min: 2,
max: 2,
},
sorting: {
uses: "sorting",
},
settings: {
uses: "settings",
},
},
},
};
export default properties;
The only thing that we do differently here in Nebula is to add the /qHyperCubeDef as a data target in data.js like this:
export default {
targets: [{
path: '/qHyperCubeDef'
}],
};
Step 4: Import packages.
Nebula.js is built on the concept of custom hooks. If you do not know what hooks are, this tutorial will give you a high-level idea of how to leverage them in terms of Nebula. As per the Getting started with Nebula extension tutorial, we will need to import two essential hooks to access the Qlik Sense object’s layout and render the data: useLayout and useEffect. Also, since our visualization is built using D3.js, we will need to install D3.js in our NodeJS environment and import the D3 package using the commands below.
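With a standard npm setup, the install and import look roughly like this:
// Install D3 from the terminal in the project folder: npm install d3
// Then import it at the top of index.js:
import * as d3 from "d3";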
Now, let us understand what we did differently here in the Nebula.js framework as compared to the Extension API.
As we can see from the above comparative analysis, the primary difference is that the Extension API uses RequireJS to load resources asynchronously and has a jQuery wrapper around the HTML element. In the Nebula.js framework, we eliminate all these dependencies, thereby making it framework agnostic and faster.
Step 5: Code Logic.
Our most important goal is to develop the main functionality of the extension. Again, the whole idea here is to replicate the exact visualization developed using Extension API without investing additional time in rewriting the source code. The entry point for the extension’s code is the index.js file, and currently, it looks like below with all the necessary packages.
import { useLayout, useElement, useEffect } from "@nebula.js/stardust";
import properties from './object-properties';
import data from './data';
import * as d3 from "d3";
export default function supernova() {
return {
qae: {
properties,
data,
},
component() {
const element = useElement();
element.innerHTML = '<div>Hello!</div>'; // eslint-disable-line
},
};
}
Now, let’s take a look at our current extension’s JS code.
To render the visualization with an HTML element, we take advantage of the paint($element, layout) method, where $element is a jQuery wrapper around the HTML element and layout contains the data and properties for the visualization. This method is called every time the visualization is rendered. So, do we have a similar approach in the Nebula.js framework? The answer is yes!
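As a rough sketch of that shape (property values and markup here are illustrative, not the actual Scatter-Pie code), an Extension API visualization generally looks like this:
define(["qlik", "jquery"], function (qlik, $) {
  return {
    initialProperties: {
      qHyperCubeDef: {
        qDimensions: [],
        qMeasures: [],
        qInitialDataFetch: [{ qWidth: 6, qHeight: 100 }],
      },
    },
    paint: function ($element, layout) {
      // $element wraps the extension's HTML element; layout carries the hypercube data
      var qMatrix = layout.qHyperCube.qDataPages[0].qMatrix;
      $element.html("<div>" + qMatrix.length + " rows</div>");
      return qlik.Promise.resolve();
    },
  };
});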
If we go back to our index.js file, we notice a function supernova( ) that consists of the component( ) method, which is where all the rendering takes place. To render something, we need to access the DOM element the visualization is assigned to, and to do so, we need to use the useElement method. Also, as I mentioned in Step 4, to access the QS object’s layout and bind the data, we need to use the useLayout and useEffect methods. These three methods are all we need to migrate our current code to the newer framework successfully.
After copying the current code and aligning it to Nebula’s programming standard, this is what we get.
export default function supernova() {
return {
qae: {
properties,
data,
},
component() {
const element = useElement();
const layout = useLayout();
const selections = useSelections();
console.log(layout)
//getting data array from QS object layout
useEffect(() => {
if (layout.qSelectionInfo.qInSelections) {
return;
}
var qMatrix = layout.qHyperCube.qDataPages[0].qMatrix;
var measureLabels = layout.qHyperCube.qMeasureInfo.map(function (d) {
return d.qFallbackTitle;
});
//an array that invokes each row of qMatrix from layout:
var data = qMatrix.map(function (d) {
return {
Dim1: d[0].qText,
Dim2: d[1].qText,
Dim3: d[2].qText,
Dim4: d[3].qText,
};
});
var width = 1000;
var height = 400;
var id = "container_" + layout.qInfo.qId;
const elem_new = `<div id=${id}></div>`;
element.innerHTML = elem_new;
viz(data, measureLabels, width, height, id);
}, [element, layout]);
}
}
}
We see that most of the code lines are similar to what we had in our Extension API. The only difference lies in the way we interact with the three methods here in Nebula.js. In the end, the viz( ) method is called within the component( ) function, and that is where our D3.js code for the visualization is. Again, this is similar to what we did in the Extension API. That is all we needed to do from a code perspective.
Step 6: Build & Deploy.
The good old extensions developed using Extension API had to be bundled (zipped) together with a .qext file, a JavaScript file, and any other dependency files. Nebula.js presents a modern way of building and preparing the extension to be deployed to the Qlik Sense environment.
Firstly, since Nebula runs in a NodeJS environment, we can easily bundle everything and distribute the visualization as an NPM package using:
npm run build
Secondly, to deploy the extension to a QS environment, we use the command below, which generates all files into the folder /hello-ext that you can then use as an extension in QS.
npm run sense
Now that we know about the resources required to migrate our existing extensions to the Nebula.js framework, let’s recap the potential benefits of the new SDK.
This brings an end to the post, and hopefully, this serves as an initial guide for developers and organizations planning to migrate their existing visualizations to the Nebula.js framework. For folks who want to get started with Nebula or build advanced visual representations using the SDK, here is a list of accumulated resources -
Let's face it - it usually takes a bit longer for features and capabilities of any product to gain traction in an organization. We released On Demand App Generation in 2018 with our Qlik Sense client-managed edition. Frankly, I don't have much insight into who has or has not implemented it. BUT, I can tell you from those that I have spoken with over the years, many were surprised to even see this awesome feature in the product when I brought it up.
However, in older versions, in order to enable it - there were a number of requirements which involved copying data load script along with inserting bindings and variables - which at first glance could be perceived as cumbersome. Even the first time I worked with it, I was a bit overwhelmed. This was true for others as well, so much so that some Qlik enthusiasts even developed web app add-ons and extensions to simplify the process and generate the template for you.
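As a rough sketch of what one of those bindings looks like in a template app's load script (the table and field names are hypothetical, and the exact binding prefix and quoting options are described in the ODAG help):
// Hypothetical template-app binding: $(od_YEAR) is replaced at app-generation time
// with the YEAR values the user selected in the selection app.
SQL SELECT *
FROM Trips
WHERE YEAR IN ( $(od_YEAR) );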
BUT....... since the release of ODAG, just like anything else, it has evolved and is now extremely simple to enable and implement. I show you this process in my latest Do More with Qlik (archive link below) session and summarize the ODAG concept in the latest Qlik Sense in 60 video embedded in this post - so please be sure to check them out. Let me know what you think in the comments below. Stay tuned to my next post where I build on what we learned about ODAG to introduce you to Dynamic Views!
On Demand App Generation - (ODAG - concept)
In summary, ODAG was originally developed to meet the need of analysis of very large data sets. The concept is quite simple:
ODAG Requirements Summarized
Qlik Sense in 60 - On Demand App Generation (video)
(Video transcript attached)
Help Topics
Source data:
https://www1.nyc.gov/site/tlc/about/tlc-trip-record-data.page
Presentation:
Do More with Qlik Session - you may need to register to access it:
https://gateway.on24.com/wcc/experience/eliteqliktech/1910644/2395144/do-more-with-qlik-for-beginners-and-beyond
Register:
https://pages.qlik.com/21Q3_QDEV_DA_GBL_DoMorewithQlikTargetpage_Registration-LP.html
Sample Apps attached - ODAG - Apps - Taxi Trips.zip - (Note: you need to add your own data connection and SQL access to your data sources)
Can't see the video? YouTube blocked by your region or organization? Download the .mp4 attached in this post to view this on your computer or mobile device.