Featured Content
-
How to contact Qlik Support
Qlik offers a wide range of channels to assist you in troubleshooting, answering frequently asked questions, and getting in touch with our technical experts. In this article, we guide you through all available avenues to secure your best possible experience.
For details on our terms and conditions, review the Qlik Support Policy.
Index:
- Support and Professional Services; who to contact when.
- Qlik Support: How to access the support you need
- 1. Qlik Community, Forums & Knowledge Base
- The Knowledge Base
- Blogs
- Our Support programs:
- The Qlik Forums
- Ideation
- How to create a Qlik ID
- 2. Chat
- 3. Qlik Support Case Portal
- Escalate a Support Case
- Phone Numbers
- Resources
Support and Professional Services; who to contact when.
We're happy to help! Here's a breakdown of resources for each type of need.
Support
Reactively fixes technical issues as well as answers narrowly defined, specific questions. Handles administrative issues to keep the product up-to-date and functioning.
- Error messages
- Task crashes
- Latency issues (due to errors or 1-1 mode)
- Performance degradation without config changes
- Specific questions
- Licensing requests
- Bug Report / Hotfixes
- Not functioning as designed or documented
- Software regression
Professional Services (*)
Proactively accelerates projects, reduces risk, and achieves optimal configurations. Delivers expert help for training, planning, implementation, and performance improvement.
- Deployment Implementation
- Setting up new endpoints
- Performance Tuning
- Architecture design or optimization
- Automation
- Customization
- Environment Migration
- Health Check
- New functionality walkthrough
- Realtime upgrade assistance
(*) reach out to your Account Manager or Customer Success Manager
Qlik Support: How to access the support you need
1. Qlik Community, Forums & Knowledge Base
Your first line of support: https://community.qlik.com/
Looking for content? Type your question into our global search bar:
The Knowledge Base
Leverage the enhanced and continuously updated Knowledge Base to find solutions to your questions and best practice guides. Bookmark this page for quick access!
- Go to the Official Support Articles Knowledge base
- Type your question into our Search Engine
- Need more filters?
- Filter by Product
- Or switch tabs to browse content in the global community, on our Help Site, or even on our YouTube channel
Blogs
Subscribe to maximize your Qlik experience!
The Support Updates Blog
The Support Updates blog delivers important and useful Qlik Support information about end-of-product support, new service releases, and general support topics.
The Qlik Design Blog
The Design blog is all about product and Qlik solutions, such as scripting, data modelling, visual design, extensions, best practices, and more!
The Product Innovation Blog
By reading the Product Innovation blog, you will learn about what's new across all of the products in our growing Qlik product portfolio.
Our Support programs:
Q&A with Qlik
Live sessions with Qlik Experts in which we focus on your questions.
Techspert Talks
Techspert Talks is a free monthly webinar to facilitate knowledge sharing.
Technical Adoption Workshops
Our in-depth, hands-on workshops allow new Qlik Cloud Admins to build alongside Qlik Experts.
Qlik Fix
Qlik Fix is a series of short videos with helpful solutions for Qlik customers and partners.
The Qlik Forums
- Quick, convenient, 24/7 availability
- Monitored by Qlik Experts
- New releases publicly announced within Qlik Community forums
- Local language groups available
Ideation
Suggest an idea, and influence the next generation of Qlik features!
Search & Submit Ideas
Ideation Guidelines
How to create a Qlik ID
Get the full value of the community.
Register a Qlik ID:
- Go to register.myqlik.qlik.com
If you already have an account, please see How To Reset The Password of a Qlik Account for help using your existing account.
- You must enter your company name exactly as it appears on your license, or there will be significant delays in getting access.
- You will receive a system-generated email with an activation link for your new account. Note: this link will expire after 24 hours.
If you need additional details, see: Additional guidance on registering for a Qlik account
If you encounter problems with your Qlik ID, contact us through Live Chat!
2. Chat
Incidents are supported through our Chat, by clicking Chat Now on any Support Page across Qlik Community.
To raise a new issue, all you need to do is chat with us. With this, we can:
- Answer common questions instantly through our chatbot
- Have a live agent troubleshoot in real time
- For items that need further investigation, we will create a case on your behalf with step-by-step intake questions.
3. Qlik Support Case Portal
Log in to manage and track your active cases in the Case Portal.
Please note: to create a new case, it is easiest to do so via our chat (see above). Our chat will log your case through a series of guided intake questions.
Your advantages:
- Self-service access to all incidents so that you can track progress
- Option to upload documentation and troubleshooting files
- Option to include additional stakeholders and watchers to view active cases
- Follow-up conversations
When creating a case, you will be prompted to enter problem type and issue level. Definitions shared below:
Problem Type
Select Account Related for issues with your account, licenses, downloads, or payment.
Select Product Related for technical issues with Qlik products and platforms.
Priority
If your issue is account related, you will be asked to select a Priority level:
Select Medium/Low if the system is accessible, but there are some functional limitations that are not critical in the daily operation.
Select High if there are significant impacts on normal work or performance.
Select Urgent if there are major impacts on business-critical work or performance.
Severity
If your issue is product related, you will be asked to select a Severity level:
Severity 1: Qlik production software is down or not available, but not because of scheduled maintenance and/or upgrades.
Severity 2: Major functionality is not working in accordance with the technical specifications in documentation or significant performance degradation is experienced so that critical business operations cannot be performed.
Severity 3: Any error that is not a Severity 1 or Severity 2 issue. For more information, visit our Qlik Support Policy.
Escalate a Support Case
If you require a support case escalation, you have two options:
- Request to escalate within the case, mentioning the business reasons.
To escalate a support incident successfully, mention your intention to escalate in the open support case. This will begin the escalation process.
- Contact your Regional Support Manager
If more attention is required, contact your regional support manager. You can find a full list of regional support managers in the How to escalate a support case article.
Phone Numbers
When other Support Channels are down for maintenance, please contact us via phone for high severity production-down concerns.
- Qlik Data Analytics: 1-877-754-5843
- Qlik Data Integration: 1-781-730-4060
- Talend AMER Region: 1-800-810-3065
- Talend UK Region: 44-800-098-8473
- Talend APAC Region: 65-800-492-2269
Resources
A collection of useful links.
Qlik Cloud Status Page
Keep up to date with Qlik Cloud's status.
Support Policy
Review our Service Level Agreements and License Agreements.
Live Chat and Case Portal
Your one stop to contact us.
Recent Documents
-
Top 10 Viz tips - part III - QlikWorld 2020
At QlikWorld 2020 I'm hosting a session called "Top 10 Visualization tips". Here's the app I used with all tips including test data. Tip titles, more details in app:
* Charts *
Parliament diagram
Scatter with trackline
Calendar Graph - Month view
Calendar Graph - Year view
Meteogram
Spiral plot
Rank chart
Slope graph
Timeline chart
Candlestick chart
Range chart
Ridgeline chart
Stream Graph
Chord diagram
Coxcomb chart
Race chart
* UI tweaks *
Toolbar toggle
Hide toolbar
Hide selection bar
Hide sheet title
Hide popup buttons
Hide Pivot buttons
Hide search bar
Hide three nav
Watermark
Center titles
Highlight rows
Larger scrollbars
* Dev tips *
Scatter overlap
Persistent colors
Color thresholds
Trellis container
Responsive and mobile tips
Include from Github
Tooltip table
Radial bar charts
100% bar charts
Title and text matters more than we think
Thumbnails
Quarterly month average
Image to chart
Magic quadrant
Link to app
I want to emphasize that many of the tips were invented by others; I have tried to credit the original authors wherever possible. Many of the tips have been published before on the Qlik Community, and the app below can be viewed as my current top picks.
If you liked it, here's more in the same style:
- 24 days of visualization Season II, Season I
- Top 10 Tips Part VIII, VII, VI, V, IV, III, II , I
- Let's make new charts with Qlik Sense
- FT Visual Vocabulary Qlik Sense version
- Similar but for Qlik GeoAnalytics : Part III, II, I
Thanks,
Patric
-
Qlik Talend Data Catalog: Configure Snowflake Bridge with Key Pair Authenticatio...
When transitioning from password-based authentication to key pair authentication for Snowflake, you may encounter issues during the configuration of import models, especially when using encrypted private key files generated with OpenSSL v3. A common error observed during connection attempts is:
Fatal: MITI.MIRException: Connection to the database with URL 'jdbc:snowflake://.snowflakecomputing.com' failed: JWT token is invalid.
java.security.NoSuchAlgorithmException: 1.2.840.113549.1.5.13 SecretKeyFactory not available
java.security.InvalidKeyException: IOException : DER input, Integer tag error
The Java process fails to complete due to an invalid JWT token, commonly associated with unsupported key formats or missing runtime parameters.
Cause
The error typically occurs due to one or more of the following factors:
- The JDBC driver does not natively support OpenSSL v3-encrypted private keys without enabling BouncyCastle.
- Required parameters (User, Password, Private key file) are not correctly configured in the import model settings.
- Miscellaneous Java options required to support the key format were not passed to the process.
Resolution
When using an unencrypted private key file:
User: Set to the Snowflake username.
Password: Enter the password
Private Key File: Provide the path to the unencrypted .pem file.
Miscellaneous Parameters: No additional Java options are required.
When using an encrypted private key file (e.g., OpenSSL v3-generated):
User: Set to the Snowflake username.
Password: Enter the passphrase used to encrypt the private key.
Private Key File: Provide the full path to the encrypted .pem file.
Miscellaneous Parameters: You must enable BouncyCastle support for decryption:
-Dnet.snowflake.jdbc.enableBouncyCastle=true
For further details, please refer to the official Snowflake documentation: Snowflake Key-Pair Authentication Guide.
Related Content
Qlik Talend Product: How to set up Key Pair Authentication for Snowflake in Talend Studio
Environment
- Talend Data Integration 8.0.1
-
Introducing Automation Sharing and Collaboration
This capability is being rolled out across regions over the period of May 5 - 8:
- May 5: India, Japan, Middle East, Sweden
- May 6: Asia Pacific, Germany, United Kingdom, Singapore
- May 7: United States
- May 8: Europe
- June: Qlik Cloud Government (exact date to be announced)
With the introduction of shared automations, it is now possible to create, run, and manage automations in shared spaces.
Content
- Allow other users to run an automation
- Collaborate on existing automations
- Collaborate through duplication
- Extended context menus
- Context menu for owners:
- Context menu for non-owners:
- Monitoring
- Administration Center
- Activity Center
- Run history details
- Metrics
Allow other users to run an automation
Limit the execution of an automation to specific users.
Every automation has an owner. When an automation runs, it will always run using the automation connections configured by the owner. Any Qlik connectors that are used will use the owner's Qlik account. This guarantees that the execution happens as the owner intended it to happen.
The user who created the run, along with the automation's owner at run time, are both logged in the automation run history.
There are five options for running an automation:
- Run an automation from the Hub and Catalog
- Run an automation from the Automations activity center
- Run an automation through a button in an app
You can now allow other users to run an automation through the Button object in an app without needing the automation to be configured in Triggered run mode. This allows you to limit the users who can execute the automation to members of the automation's space.
More information about using the Button object in an app to trigger an automation can be found in How to run an automation with custom parameters through the Qlik Sense button.
- Programmatic executions of an automation
- Automations API: Members of a shared space will be able to run automations over the /runs endpoint if they have sufficient permissions (see the sketch after this list).
- Run Automation and Call Automation blocks
- Note for triggered automations: the user who creates the run is not logged, as no user-specific information is used to start the run. The authentication to run a triggered automation depends on the Execution Token only.
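As a rough illustration of the programmatic route above, the hedged sketch below calls the automation /runs endpoint with an API key. The tenant host name, automation ID, and request body are placeholders and assumptions; check the Qlik Cloud Automations API reference for the exact payload.
# Hedged sketch only: trigger an automation run over the REST API.
# <tenant>, <automation-id>, and <api-key> are placeholders; the JSON body shape is an assumption.
curl -X POST "https://<tenant>.us.qlikcloud.com/api/v1/automations/<automation-id>/runs" \
  -H "Authorization: Bearer <api-key>" \
  -H "Content-Type: application/json" \
  -d '{"context": "api"}'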
Collaborate on existing automations
Collaborate on an automation through duplication.
Automations are used to orchestrate various tasks, from Qlik use cases like reload task chaining, app versioning, or tenant management, to action-oriented use cases like updating opportunities in your CRM, managing supply chain operations, or managing warehouse inventories.
Collaborate through duplication
To prevent users from editing these live automations, we're putting forward a collaborate-through-duplication approach. This makes it impossible for non-owners to make changes to an automation that could negatively impact operations.
When a user duplicates an existing automation, they will become the owner of the duplicate. This means the new owner's Qlik account will be used for any Qlik connectors, so they must have sufficient permissions to access the resources used by the automation. They will also need permissions to use the automation connections required in any third-party blocks.
Automations can be duplicated through the context menu:
As it is not possible to display a preview of the automation blocks before duplication, please use the automation's description to provide a clear summary of the purpose of the automation:
Extended context menus
With this new delivery, we have also added new options to the automation context menu:
- Start a run from the context menu in the hub
- Duplicate automation
- Move automation to shared space
- Edit details (owners only)
- Open in new tab (owners only)
Context menu for owners:
Context menu for non-owners:
Monitoring
The Automations Activity Centers have been expanded with information about the space in which an automation lives. The Run page now also tracks which user created a run.
Note: Triggered automation runs will be displayed as if the owner created them.
Administration Center
The Automations view in Administration Center now includes the Space field and filter.
The Runs view in Administration Center now includes the Executed by and Space at runtime fields and filters.
Activity Center
The Automations view in Automations Activity Center now includes Space field and filter.
Note: Users can configure which columns are displayed here.
The Runs view in the Automations Activity Center now includes the Space at runtime, Executed by, and Owner fields and filters.
In this view, you can see all runs from automations you own as well as runs executed by other users. You can also see runs of other users' automations where you are the executor.
Run history details
To see the full details of an automation run, go to Run History through the automation's context menu. This is also accessible to non-owners with sufficient permissions in the space.
The run history view will show the automation's runs across users, and the user who created the run is indicated by the Executed by field.
Metrics
The metrics tab in the automations activity center has been deprecated in favor of the automations usage app which gives a more detailed view of automation consumption.
-
Advanced Qlik Sense System Monitoring
Content
- Chapters
- Environment overview
- Zabbix Server set-up
- Clone the Zabbix docker repository
- Setting up environment variables
- Re-link compose.yaml to our preferred compose file
- Logging in for the first time
- Zabbix Agent installation on Windows Server
- Zabbix Server Configuration
- Adding the first Server
- Importing Qlik Sense Enterprise for Windows templates
- Linking templates to hosts
- Engine Healthcheck Monitoring with HTTP Agent example
- Steps to configure a new HTTP Agent for QSE Health monitoring
- Defining the Virtual Proxy prefix for Zabbix HTTP Agent
- Resources & Links
Chapters
- 01:33 - Why use Zabbix
- 02:35 - Architecture for demo
- 03:41 - Downloading the installer
- 04:36 - Installing Zabbix Server
- 08:37 - Installing the Zabbix agent
- 12:17 - Applying Qlik specific templates
- 14:28 - Reviewing Qlik-specific Dashboards
- 16:49 - Configuration details
- 18:42 - How to create a dashboard
- 20:30 - Q&A: Can Zabbix run on Windows?
- 21:16 - Q&A: Is Zabbix supported by Qlik?
- 21:36 - Q&A: Can this monitor data capacity?
- 22:45 - Q&A: Can the Zabbix agents affect performance?
- 23:20 - Q&A: Can it monitor bookmark size?
- 24:02 - Q&A: Can this monitor amount of data being used?
- 24:19 - Q&A: Can this monitor sheets, and objects in apps?
- 24:49 - Q&A: Is there a similar tool for Cloud?
- 25:36 - Q&A: Would this work with QlikView?
- 26:11 - Q&A: Does this read the app data?
- 26:26 - Q&A: Can this help measure how long to open an app?
The information in this article and video is provided as is. If you need assistance with Zabbix, please engage with Zabbix directly.
Environment overview
The environment demonstrated in this article consists of one Central Node and two Worker Nodes. Worker 1 is a Consumption node where both Development and Production apps are allowed. Worker 2 is a dedicated Scheduler Worker node where all reloads will be directed. The Central Node acts as the Scheduler Manager.
Zabbix Server set-up
The Zabbix Monitoring appliance can be downloaded and configured in a number of ways, including direct install on a Linux server, OVF templates and self-hosting via Docker or Kubernetes. In this example we will be using Docker. We assume you have a working docker engine running on a server or your local machine. Docker Desktop is a great way to experiment with these images and evaluate whether Zabbix fits in your organisation.
Clone the Zabbix docker repository
This will include all necessary files to get started, including docker compose stack definitions supporting different base images, features and databases, such as MySQL or PostgreSQL. In our example, we will invoke one of the existing Docker compose files which will use PostgreSQL as our database engine.
Source: https://www.zabbix.com/documentation/current/en/manual/installation/containers#docker-compose
git clone https://github.com/zabbix/zabbix-docker.git
Setting up environment variables
Here you can modify environment variables as needed, to change things like the Stack / Composition name, default ports and many other settings supported by Zabbix.
cd ./zabbix-docker/env_vars
ls -la   # to list all hidden files (.dotfiles)
nano .env_web
In this file, we will change the value for ZBX_SERVER_NAME to something else, like "Qlik STT - Monitoring". Save the changes and we are ready to start up Zabbix Server.
Re-link compose.yaml to our preferred compose file
The ./zabbix-docker folder contains many different docker compose templates, either using public images or locally built ones (latest and local tags).
You can run your chosen base image and database version with:
docker compose -f compose-file.yaml up -d && docker compose logs -f --since 1m
Or unlink and re-create the symbolic link to compose.yaml, which enables managing the stack without specifying a compose file. Run the following commands inside the zabbix-docker folder to use the latest Ubuntu-based image with the PostgreSQL database:
unlink compose.yaml
ln -s ./docker-compose_v3_ubuntu_pgsql_latest.yaml compose.yaml
- Start the Zabbix stack in detached mode with docker compose up -d
If you skip the -d flag, the Docker stack will start and your command line will be connected to the log output for all containers. The stack will stop if you exit this mode with CTRL+C or by closing the terminal session. Detached mode will run the stack in the background. You can still connect to the live log output, pull logs from history, manage the stack state, or tear it down using docker compose down.
Pro tip: you will be using docker compose commands often when working with Docker. You can create an alias in most shells to a short-hand, such as "dc = docker compose". The alias will still accept all following verbs, such as start|stop|restart|up|down|logs, and all following flags. docker compose up -d && docker compose logs -f --since 1m would become dc up -d && dc logs -f --since 1m.
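For example, a minimal sketch of that alias in bash or zsh (add it to your shell profile to make it persistent):
# Short-hand alias for docker compose; illustrative only
alias dc='docker compose'
dc up -d && dc logs -f --since 1m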
Logging in for the first time
- By default, the Zabbix Web GUI will be exposed on ports 80/443
- Using tools like Portainer makes Docker stack management easier
Use the IP address of your Docker host: http://IPADDRESS or https://IPADDRESS.
The Zabbix server stack can be hosted behind a Reverse Proxy.
The default username is Admin and the default password is zabbix. They are case sensitive.
Zabbix Agent installation on Windows Server
Download link: https://www.zabbix.com/download_agents, in this case download the Windows installer MSI.
- Run the installer .msi
- Leave components unchanged
- Hostname = your machine hostname; we will have to use the same hostname when adding a Host in Zabbix Server.
- Zabbix server IP/DNS: IP address or DNS name of your Zabbix Server
- Agent listening port; the same port will be used when adding a Host in Zabbix Server.
- Enable "Add agent location to the PATH" for convenience in the command line
- Finish installation
Zabbix Server Configuration
Adding the first Server
After Agent is installed, in Zabbix go to Data Collection > Hosts and click on Create host in the top right-hand corner. Provide details like hostname and port to connect to the Agent, a display name and adjust any other parameters. You can join clusters with Host groups. This makes navigating Zabbix easier.
Fig 1: Adding a Host
Note: Remember to change how Zabbix Server will connect to the Agent on this node, either with IP address or DNS. Note that the default IP address points to the Zabbix Server.
Importing Qlik Sense Enterprise for Windows templates
In the Zabbix Web GUI, navigate to Data Collection > Templates and click on the Import button in the top right-hand corner. You can find the templates file at the following download link:
LINK to zabbix templates
Linking templates to hosts
Once you have added all your hosts to the Data Collection section, we can link all Qlik Sense servers in a cluster using the same templates. Zabbix will automatically populate metrics where these performance counters are found. From Data Collection > Hosts, select all your Qlik Sense servers and click on "Mass update". In the dialog that comes up, select the "Link templates" checkbox. Here you can link/replace/unlink templates across many servers in bulk.
Select "Link" and click on the "Select" button. This new panel will let us search for Template groups and make linking a bit easier. The Template Group we provided contains 4 individual templates.
Fig 2: Mass update panel
Fig 3: Search for Template Group
Once you Select and Update on the main panel, all selected Hosts will receive all items contained in the templates, and populate all graphs and Dashboards automatically.
To review your data, navigate to Monitoring > Hosts and click on the "Dashboards" or "Graphs" link for any node, here is the default view when all Qlik Sense templates are linked to a node:
Fig 4: Host Dashboards
Fig 5: Repository Service metrics - Example
Engine Healthcheck Monitoring with HTTP Agent example
We will query the Engine Healthcheck endpoint on QlikServer3 (our consumer node) and extract usage metrics by parsing the JSON output.
Steps to configure a new HTTP Agent for QSE Health monitoring
We will be using a new Anonymous Access Virtual Proxy set up on each node. This Virtual Proxy will only balance to the node it represents, to ensure we extract meaningful metrics from the Engine and are not load-balanced by the Proxy service across multiple nodes; otherwise, there would be no way to determine which node is responding without looking at DevTools in your browser. You can also use Header or Certificate authentication in the HTTP Agent configuration.
Once the Virtual Proxy is configured with Anonymous Only access, we can use this new prefix to configure our HTTP Agent in Zabbix.
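For reference, the same endpoint can also be queried manually. A hedged sketch (the host name and the anon virtual proxy prefix are placeholders, and jq is assumed to be installed):
# Query the Engine healthcheck through the anonymous virtual proxy and pull the active session count
curl -sk "https://qlikserver3.domain.local/anon/engine/healthcheck" | jq '.session.active'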
Defining the Virtual Proxy prefix for Zabbix HTTP Agent
In the Zabbix web GUI, go to Data collection > Hosts. Click on any of your hosts. On the tabs at the top of the pop-up, click on Macros and click on the "Inherited and host macros" button. Once the list has loaded, search for the following Macro: {$VP_PREFIX}. This is set by default to "anon". Click on "Change", set the Macro value to your custom Virtual Proxy prefix for Engine diagnostics, and click Update. The Virtual Proxy prefix will have to be changed on each node for the "Engine Performance via HTTP Agent" item to work. Alternatively, you can modify the macro value for the Template; this will replicate the change across all nodes associated with this Template.
Fig 6: Changing Host Macros from Inherited values
To make this change at the Template level, go to Data collection > Templates. Search for the "Engine Performance via HTTP Agent" and click on the Template. Navigate to the Macros tab in the pop-up and add your Virtual Proxy Prefix here to make this the new default for your environment. No further changes to Node configuration are required at this point.
Fig 7: Changing Macros at the Template level
The Zabbix templates provided in this article contain the following Engine metric JSONParsers:
- Memory: Allocated, Committed, Free, Total Physical
- Calls, Selections
- Saturation status (true/false)
- Sessions: Active/Total
- Users: Active/Total
These are the same performance counters that you can see in the Engine Health section in QMC.
Stay tuned for new releases of the Monitoring Templates. Feel free to customise these to your needs and share them with the Community.
Resources & Links
- Zabbix home page
- Zabbix Installation from containers documentation
- Zabbix Docker repository on GitHub
- Install Docker Engine on Ubuntu
Environment
- Qlik Sense Enterprise on Windows
-
Core based licenses in Qlik Sense
Core-based and capacity-based licensing is a change from traditional licensing. Rather than limiting the number of users who can access the Qlik Sense site by a limit on tokens, core-based licensing restricts user access by limiting the number of CPU cores which can be used by the Engine to deliver end-user content.
With a core-based license, there is no functional reason to use the License Monitor app since the users will not use tokens. Instead, consumption of apps can be gauged by reviewing the Session Overview and Session Details sheets on the Operations Monitor.
Environments:
- Qlik Sense Enterprise on Windows with Core-based license
After applying the license, the administrator should review the Engine configuration to ensure that sufficient capacity has been assigned to the Engine(s) in the Qlik Sense site.
In this example, 256 cores are allocated to the license, but the Central node only has 8 available cores.
After adjusting Cores to allocate to 8, we then have 248 cores which can be allocated across RIM nodes, if applicable.
See also Get Started with APIs on Windows > Qlik Analytics Platform on help.qlik.com.
To prevent user access, security rules must be created.
-
Qlik NPrinting: Days to Keep / Reports to Keep Setting Ignored
While running Qlik NPrinting Tasks, many reports are generated. However, the setting for Days to Keep / Reports to Keep seems to be ignored.
Environment:
Qlik NPrinting all versions
The first thing to consider is that Qlik NPrinting generates one report for each Newsstand user. For this reason, there will be one file for each user, and not "just one", in the Qlik NPrinting Application Data folder. Those reports may be absolutely identical.
In that sense, the "reports to keep" setting does not apply here and should be considered as "reports to keep per user". An improvement is currently being tracked by R&D; check improvement on days to keep in NPrinting for details. In addition, see Publish task "Reports to Keep" does not work during first 24 hours.
As for the setting apparently being ignored: by default, Qlik NPrinting runs the check to delete old reports once every 24 hours. If no new report has been generated in the meantime, the old reports will not be deleted, even if more than one day has passed.
If it appears that the periodic check is not being run, the settings can be enforced:
- Stop the Qlik NPrinting services (see Proper Order to Restart Qlik NPrinting Services)
- Navigate to and open C:\Program Files\NPrintingServer\NPrinting\Scheduler\scheduler.config
- Add the below lines before the </appSettings> closing tag
<add key="cleanup-period" value="1800" /> <add key="cleanup-period-files" value="43200" />
- Restart the services
The first setting is the recurrence in seconds (default: 30 minutes) for the task that will check for reports in the newsstand and unlink the older ones.
The second setting is the recurrence in seconds (default: 24 hours) for the task that will scan and remove the actual files, after checking that the conditions about days and reports to keep, plus the presence of newer reports that supersede them, are met. In this case, we would change the default value to a shorter time (43200 seconds, 12 hours) to make sure the task is run more frequently.
-
Lef expires or Lef expired displayed in QlikView Management console alert banner
Opening the QlikView Management Console (QMC) shows an orange banner across the top:
Lef expires DD/MM/YYYY
Environments:
- QlikView any version
Expired Maintenance
Your maintenance support period has or is going to soon expire. Note that this can apply for both a QlikView Server license as well as a QlikView Publisher license.
Review your LEF information for the following tag:
Perpetual License:
- PRODUCTLEVEL;10;;2019-01-31
- PRODUCTLEVEL;30;;2019-01-31
Subscription License:
- TIMELIMIT;VALUE;2021-06-30;2022-06-30
To solve:
If you have not renewed your maintenance contract, this is expected and working as designed. Although the banner is displayed, the system keeps working, but it cannot be upgraded or supported by Qlik.
If your contract has been renewed, do the following:
- Go to the QlikView Management Console
- Navigate to System > Licenses
- Choose your QVS@node/QMS@node (check both) and open the QlikView Server Licenses tab (Note: Be sure to check the Publisher license key as well.)
- On the bottom, click: Update License from Server
- Restart the QlikView services.
NOTE:
- Be sure to check the Publisher license key as well. If the Publisher LEF has expired it will also display the orange banner. You will need to update the License as well.
- If the license has been extended with a time limit (example: TIMELIMIT;VALUE;;2022-05-21), the product will work until the mentioned time limit date; however, the banner will still show the expired message in the management console. This is working as designed.
Signed License Key applied
Another root cause may be a known issue with the Signed License Key on QlikView 12.40, or improper permissions assigned to the Qlik Service Dispatcher. Product Defect ID(s): QV-17628
To solve:
If you are using version 12.40, note that there is a new service called Qlik Service Dispatcher. An admin account needs to be applied to resolve the issue. Qlik Management Service (QMS) may also need to be restarted as a workaround.
This issue is resolved in 12.50.
-
Qlik Replicate 2024.5 GA Build 144: Store Changes using the new use_manipulation...
With Store Changes (see Change Processing) turned on for CDC tasks, Qlik Replicate tasks may crash or stop unexpectedly with the new feature use_manipulation_pk_for_apply enabled.
The following errors may be logged in the task log:
2024-07-30T09:53:22:497564 [AT_GLOBAL ]E: An exception occurred!!! (win32_exception_handler.c:109)
2024-07-30T09:53:22:499561 [AT_GLOBAL ]E: Backtrace at exception: !{C:\Program Files\Attunity\Replicate\bin\at_base.dll!4537ab...
2024-07-30T09:53:22:499561 [AT_GLOBAL ]E: exception code is 3221225477 (win32_exception_handler.c:112)
2024-07-30T09:53:22:499561 [AT_GLOBAL ]E: tid=22112 (win32_exception_handler.c:115)
2024-07-30T09:53:22:499561 [AT_GLOBAL ]E: exception as string is EXCEPTION_ACCESS_VIOLATION (win32_exception_handler.c:118)
Environment
- Qlik Replicate 2024.5.144 GA & 2024.5.247 PR1
Resolution
Fix Version
Upgrade to 2024.5 SP02 (for Windows and Linux) when available.
Workaround
To work around the issue:
- Go to Task Settings...
- Open the Full Load tab
- In Full Load Settings set If target table already exists: DROP and CREATE table
- Back in Task Settings... open the More Options tab
- Add the following feature:
use_manipulation_pk_for_apply
value: Off
Cause
Product Defect ID: QB-28312
Information provided on this defect is given as is at the time of documenting. For up to date information, please review the most recent Release Notes, or contact support with the ID QB-28312 for reference.
-
Qlik Replicate: ORA-01555: snapshot too old: rollback segment number string with...
Loading data from Oracle may fail on a full load with the error:
ORA-01555: snapshot too old: rollback segment number string with name "string" too small
Resolution
This is an Oracle configuration issue which must be resolved for the task to be able to continue.
In Automatic Undo Management mode, increase the setting of UNDO_RETENTION. Otherwise, use larger rollback segments.
You can verify your current settings:
SHOW PARAMETER UNDO;
SELECT SUM(BYTES)/1024/1024 "MB", TABLESPACE_NAME FROM DBA_FREE_SPACE GROUP BY TABLESPACE_NAME;
Verify how large the problematic table is and what the current settings are. Then increase the sizes as per your findings.
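As a hedged illustration only (the retention value and the use of SQL*Plus as SYSDBA are assumptions; size the undo according to your own findings):
# Hedged sketch: raise undo retention to 3 hours (10800 s) in Automatic Undo Management mode
sqlplus / as sysdba <<'SQL'
ALTER SYSTEM SET UNDO_RETENTION = 10800 SCOPE=BOTH;
SHOW PARAMETER UNDO_RETENTION;
SQL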
Oracle references:
ORA-01555 - Database Error Messages
ORA-01555 "Snapshot too old" - Detailed Explanation
snapshot too old error
Cause
It is caused by rollback records needed by a reader being overwritten by other writers.
Environment:
-
Stitch Migration to Qlik Cloud
This Techspert Talks session will address:
- Understanding the schemas
- Demonstration of the migration process
- Best practices and tips for a smooth transition
Chapters:
- 01:05 - How to learn about Qlik Talend Cloud
- 01:52 - Why migrate to Qlik Talend Cloud
- 02:42 - Feature difference highlights
- 03:08 - Using the Migration Inventory Tool
- 06:58 - First look at Qlik Talend Cloud Data Integration
- 07:33 - Creating the first Project and Space
- 08:21 - Creating the Klaviyo Source connection
- 09:54 - Creating the Target connection
- 10:52 - Choosing the task settings
- 12:59 - Viewing Pipeline tasks in action
- 13:55 - Target table differences
- 15:19 - Creating the MySQL source
- 19:24 - Q&A: Will QTC features be added to Stitch?
- 20:04 - Q&A: What is the QTC warehouse architecture?
- 21:45 - Q&A: How to build incremental loads?
- 22:28 - Q&A: Can QTC load data from Marketo?
- 23:37 - Q&A: Why are the schemas different?
- 24:26 - Q&A: Where is the list of data sources?
- 25:23 - Q&A: Where can I get a test account to try it?
Resources:
- About Qlik Talend Cloud subscriptions
- Technical feature comparison
- Performing an inventory analysis
- Qlik Ideation
- Schema Differences in data loading
-
Qlik Talend Product: How to set up Key Pair Authentication for Snowflake in Tale...
This guide briefly offers a step-by-step process on how to set up key-pair authentication for Snowflake in Talend Studio at Job level.
The process can be summarized in three steps:
- Creating the .p12 file with OpenSSL
- Configuring Snowflake
- Configuring Talend Studio at Job Level
Creating the .p12 File with OpenSSL
The .p12 file contains both the private and public keys, along with the owner's details (such as name, email address, etc.), all certified by a trusted third party. With this certificate, a user can authenticate and identify themselves to any organization that recognizes the third-party certification.
The Talend tSetKeyStore component can only take in .jks or .p12/.pfx formats. If you are using the PKCS#8 format, you need to convert your .p8 keys into a supported format.
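As a hedged illustration (file names and the alias are placeholders, not from this article), an existing PKCS#8 key plus its certificate can be wrapped into a PKCS#12 keystore like this:
# Convert an existing PKCS#8 private key and its certificate into a .p12 keystore usable by tSetKeyStore
openssl pkcs12 -export -inkey rsa_key.p8 -in certificate.crt -out keystore.p12 -name "my_key_alias"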
- Generate the key with the following command line prompt:
openssl genpkey -algorithm RSA -out private.key -aes256
This will generate a private key (private.key) using the RSA algorithm with AES-256 encryption. You'll be prompted to enter a passphrase to protect the private key.
- Generate a self-signed certificate using the following command line prompt:
openssl req -new -x509 -key private.key -out certificate.crt -days 1825
This command generates a self-signed certificate (certificate.crt) that is valid for 5 years. You will be prompted to enter details like country, state, and organization when generating the certificate.
- Once you have both the private key (private.key) and certificate (certificate.crt), please create the .p12 file using the following command line and name your key alias.
openssl pkcs12 -export -out keystore.p12 -inkey private.key -in certificate.crt -name "abe"
And check the created .p12 file information with below command:
openssl pkcs12 -info -in keystore.p12
or
keytool -v -list -keystore keystore.p12
- Generate a public key with the following command line:
openssl x509 -pubkey -noout -in certificate.crt > public.key
Configuring Snowflake:
The USERADMIN role is required to perform the Snowflake configuration. Open your Snowflake environment and ensure you have a worksheet or query editor ready to execute the following SQL statements.
- For this step, you will create the necessary Snowflake components (database, warehouse, user, and role) for testing purposes. If you already have an existing setup or example, feel free to re-use it.
-- Drop existing objects if they exist
DROP DATABASE IF EXISTS ABE_TALEND_DB; -- Drop the test database
DROP WAREHOUSE IF EXISTS ABE_TALEND_WH; -- Drop the test warehouse
DROP ROLE IF EXISTS ABE_TALEND_ROLE; -- Drop the test role
DROP USER IF EXISTS ABE_TALEND_USER; -- Drop the test user
-- Create necessary objects
CREATE WAREHOUSE ABE_TALEND_WH; -- Create the warehouse
CREATE DATABASE ABE_TALEND_DB; -- Create the test database
CREATE SCHEMA ABE_TALEND_DB.ABE; -- Create the schema "ABE" in the test database
-- Create the test user
CREATE OR REPLACE USER ABE_TALEND_USER
  PASSWORD = 'pwd!' -- Replace with a secure password
  LOGIN_NAME = 'ABE_TALEND_USER'
  FIRST_NAME = 't'
  LAST_NAME = 'tt'
  EMAIL = 't.tt@qlik.com' -- Replace with a valid email
  MUST_CHANGE_PASSWORD = FALSE
  DEFAULT_WAREHOUSE = ABE_TALEND_WH;
-- Grant necessary permissions
GRANT USAGE ON WAREHOUSE ABE_TALEND_WH TO ROLE SYSADMIN; -- Grant warehouse access to SYSADMIN
CREATE ROLE IF NOT EXISTS ABE_TALEND_ROLE; -- Create the custom role
GRANT ROLE ABE_TALEND_ROLE TO USER ABE_TALEND_USER; -- Assign the role to the user
GRANT ALL PRIVILEGES ON DATABASE ABE_TALEND_DB TO ROLE ABE_TALEND_ROLE; -- Full access to the database
GRANT ALL PRIVILEGES ON ALL SCHEMAS IN DATABASE ABE_TALEND_DB TO ROLE ABE_TALEND_ROLE; -- Full access to all schemas
GRANT ALL PRIVILEGES ON ALL TABLES IN SCHEMA ABE_TALEND_DB.ABE TO ROLE ABE_TALEND_ROLE; -- Full access to all tables in schema
GRANT USAGE ON WAREHOUSE ABE_TALEND_WH TO ROLE ABE_TALEND_ROLE; -- Grant warehouse usage to custom role
-- Verify user creation
SHOW USERS;
-- Create a test table and validate setup
CREATE TABLE ABE_TALEND_DB.ABE.ABETABLE (
  NAME VARCHAR(100)
);
-- Test data retrieval
SELECT * FROM ABE_TALEND_DB.ABE.ABETABLE;
- For this step, please assign the public key to the Snowflake test user created earlier. To do this, you'll need to do the following:
- Locate public.key and open it in an editor (such as Notepad++)
- Copy the public key displayed between BEGIN PUBLIC KEY and END PUBLIC KEY
- In the Snowflake environment, open a worksheet or query editor to run the following SQL statements. You will add the previously generated public key to the user; be sure to replace it with your own key.
ALTER USER ABE_TALEND_USER SET RSA_PUBLIC_KEY='<public key>';
And to verify that the key was successfully added:
DESCRIBE USER ABE_TALEND_USER;
- Now we'll verify that the configuration is correct. In your Snowflake environment, open a worksheet or query editor, run the following SQL statements, and copy the result (a SHA-256 hash of our public key) into Notepad or any text editor for reference.
DESC USER ABE_TALEND_USER;
SELECT SUBSTR((SELECT "value" FROM TABLE(RESULT_SCAN(LAST_QUERY_ID())) WHERE "property" = 'RSA_PUBLIC_KEY_FP'), LEN('SHA256:') + 1);
Using OpenSSL, we will calculate the SHA-256 hash of the public key and compare it with the one previously generated by Snowflake to ensure they match.
To do that, use the following OpenSSL command:
openssl rsa -pubin -in public.key -outform DER | openssl dgst -sha256 -binary | openssl enc -base64
If the hash matches, proceed to Talend Studio configuration.
Configuring Talend Studio at Job Level:
- Launch your Talend Studio and drag both the tSetKeyStore and tDBConnection (Snowflake) components from the Palette to the Designer tab
- In the Basic settings of the tSetKeyStore component, enter the path to the keystore .p12 file in double quotation marks in the KeyStore file field:
- Use the Key Alias set in the keystore.p12 file earlier for the Snowflake DB Connection ("abe", for this example):
- Please test the connection to see if the key-pair authentication you set up works
Related Content
Talend-Job-using-key-pair-authentication-for-Snowflake-fails
Environment
Talend Studio 8.0.1
-
Not able to create data spaces with Qlik Cloud Analytics license
Customers with a Qlik Cloud Analytics license cannot create or edit data spaces.
When creating a space, only the options for managed or shared spaces are available:
Resolution
Data Spaces are part of the Qlik Talend Data Integration offering.
Despite the name, “data spaces” are not meant to store Qlik Cloud Analytics data. Qlik Cloud Analytics customers should store their datasets in managed and shared spaces.
To address early misunderstandings and assist customers who had originally stored their datasets there, Qlik has decided to allow analytics customers to keep using data spaces with the following workaround.
IMPORTANT
- The decision to allow this workaround might be reverted at any time, preceded by a timely communication.
- Analytics-only customers who never made use of data spaces should refrain from starting now.
- Analytics-only customers who are already working with data spaces should consider planning to move away from them, migrating their datasets and connections and should not start implementing new workflows involving data spaces.
- Additional Qlik Talend Data Integration-only features, like the possibility to work with pipelines or the Qlik Data Movement Gateway, are not available to Analytics-only customers.
To create and edit spaces:
- Open the Qlik Cloud main menu
- Go to Data Integration
- Open Projects
You can edit existing datasets or create new datasets from the Projects Activity Center.
Environment
- Qlik Cloud Analytics
-
Databricks Endpoint Connection Error after upgrading Qlik Replicate: SSL_connect...
Following an upgrade of Qlik Replicate from version 2023.05 to 2024.05 and an upgrade of the Databricks ODBC drivers from version 2.6.22 to 2.8.2, the following error is encountered when configuring and testing the Databricks endpoint:
SYS-E-HTTPFAIL, Failed prepare Cloud component. SYS,GENERAL_EXCEPTION,Failed prepare Cloud component,Cannot connect to Cloud server RetCode: SQL_ERROR SqlState: HY000 NativeError: 14 Message: [Simba][ThriftExtension] (14) Unexpected response from server during a HTTP connection: SSL_connect: certificate verify failed. Failed to find field. Field named at object
Resolution
If you are using your organization's root certificates with the ODBC driver 2.8.2, then you should add the root certificates to the cacerts.pem file, which is located in the ODBC driver directory.
For more information, see: Magnitude Simba Apache Spark ODBC Data Connector (.pdf download).
If this option is not set, the connector defaults to using the trusted CA certificates .pem file installed by the connector. To use the trusted CA certificates in the .pem file, set the UseSystemTrustStore property to 0 or clear the Use System Trust Store check box in the SSL Options dialog.
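As a hedged illustration (the driver path below is typical for a Linux install of the Simba Spark ODBC connector and may differ in your environment, especially on Windows), appending your organization's root CA chain could look like this:
# Append the organization's root CA chain to the CA bundle shipped with the ODBC driver
cat org-root-ca.pem >> /opt/simba/spark/lib/64/cacerts.pem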
Cause
- In version 2.6.22 of the ODBC driver, the default setting for UseSystemTrustStore was 1, meaning the driver checked the SSL certificate in the Windows trust store.
- In version 2.8.2, the default setting for UseSystemTrustStore was changed to 0, so the driver now uses the trusted CA certificates in the .pem file rather than referencing the Windows trust store.
Environment
- Qlik Replicate
-
Qlik Replicate: how to resolve KAFKA timestamp -1 values
Timestamp values may be written to Kafka as "-1". How can this be resolved?
Resolution
To avoid the negative timestamps, add the Internal Parameter rdkafkaProperties to the endpoint connection, using the value: api.version.request=true;api.version.fallback.ms=0;
- Stop the task (if running)
- Go to the Kafka Endpoint connection
- Switch to the Advanced tab
- Click Internal Parameters
- Add:
- Parameter: rdkafkaProperties
- Value: api.version.request=true;api.version.fallback.ms=0;
More than one value can be added. If you have previously added the rdkafkaProperties parameter and have an active value, follow the current value with a semicolon (;) before appending the new one.
For more information on Internal Parameters, see Qlik Replicate: How to set Internal Parameters and what are they for?
Example: OLD_VALUE;NEW_VALUE
- Confirm with OK
- Save the endpoint
- Reload the target
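To confirm the change took effect, you can spot-check record timestamps with the Kafka console consumer. A hedged sketch (the broker address, topic name, and script location are placeholders):
# Print the timestamp of a few records; CreateTime should no longer show -1
kafka-console-consumer.sh --bootstrap-server broker1:9092 --topic my_topic \
  --property print.timestamp=true --from-beginning --max-messages 5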
Environment
- Qlik Replicate
- KAFKA endpoint
-
How to get started with the Qlik Cloud Catalog connector in Qlik Application Aut...
Qlik Cloud Catalog is a native connector that lets you automate tasks related to the Data Products Catalog and Data Quality workflows. With this connector, you can enable the scheduling of quality computations for datasets, streamlining data validation processes across your organization. Aligning Data Quality execution with ELT or ETL processes helps you assess the trustworthiness of your data, especially as it may be consumed downstream by analytics, AI models, or other consumers.
Contents
- Authentication
- Use cases
- 1: Schedule the quality compute for dataset
- 2: Get notified upon quality compute status
Authentication
This connector does not require additional configuration to authenticate; it will automatically connect to the automation owner's Qlik account. Whenever blocks of this connector are executed, they will use that account. Additional blocks (like retrieving data products or getting quality indicators) will be released over time; you can request new ones through Ideation.
Use cases
For the initial release, the connector introduces two main capabilities:
- Schedule quality computation for selected datasets
- Send notifications based on the computation result
1: Schedule the quality compute for dataset
Once your datasets are registered in Qlik Cloud Catalog, you can use Qlik App Automation to schedule their quality computation with custom parameters. This ensures data quality stays aligned with your freshness and operational needs.
You can configure the computation mode:
- Pushdown (for Snowflake and Databricks datasets): computation runs on the cloud data warehouse side (note: it consumes data warehouse credits).
- Pullup (Qlik Cloud): computation runs in Qlik Cloud.
Both modes allow you to define a sample size. Pullup uses a head sample; pushdown uses a random sample.
To set this up:
- Use the trigger data quality computation block.
- Specify the dataset id (found in the dataset's details panel in Qlik Cloud Catalog).
- Configure the mode (pushdown or pullup) and sampling options.
- Add the trigger mode to the start block of your automation; this is where you can schedule it.
2: Get notified upon quality compute status
To know whether your automation ran successfully, or when you need to take action in case of failure, you can add blocks to your automation that push an alert to the system of your choice. In the template, we propose sending a notification to a Slack channel.
To monitor your automation results:
- Use the wait for quality compute block, which loops and monitors computation progress.
You can then trigger alerts based on outcomes:
- Send a Slack notification if the computation fails
- Or notify every successful computation if preferred.
Future updates will allow threshold-based alerts, letting you trigger actions based on data quality indicator results.
-
Qlik Talend Studio: The Calling Function via tDBInput can not INSERT/UPDATE data...
After upgrading from Talend Version 7 to Version 8, you may encounter an issue where a Custom SQL function (mixed read/write SQL) previously invoked via the tDBInput component in Talend Version 7 and containing both INSERT/UPDATE and SELECT statements no longer performs the INSERT/UPDATE operations in Talend V8.
Even though the function still returns data, the changes to the target tables are not applied as they were in Talend Version 7.
Resolution
Please use the tDBRow and tParseRecordSet components to support the mixed read/write SQL function call.
Cause
Mixed read/write SQL function execution is no longer supported via tDBInput component in the latest Studio Version.
Related Content
combining-two-flows-for-selective-output-standard-component-in-this
Environment
-
Qlik Talend Data Integration: Jobs failing to execute due to the error java.lang...
The primary error is displayed as follows in Studio/Remote Engine/Job Server Logs:
Error: LinkageError occurred while loading main class <job>_<version>.<job> java.lang.UnsupportedClassVersionError: <project>/<job>_<version>/<job> has been compiled by a more recent version of the Java Runtime (class file version 61.0), this version of the Java Runtime only recognizes class file versions up to 55.0
Cause
A review of the error shows
“…compiled by a more recent version of the Java Runtime (class file version 61.0)”
A review of the Java Runtime class file version table shows which Java version the Job was compiled with. In this case, it is Java 17.
Java SE 8 → class file version 52
Java SE 11 → class file version 55
Java SE 17 → class file version 61
Resolution
Configure the Job execution on the Remote Engine/Job Server to utilize Java 17. For further details, please refer to the Remote Engine configuration guide's section on setting the Java execution path.
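As a quick hedged check (commands shown for a Linux shell; adapt for Windows), verify which Java runtime the engine or job server picks up before adjusting the configured execution path:
# Class file version 61.0 corresponds to Java 17, so this should report 17.x
java -version
# The path that the Remote Engine / Job Server configuration should point to
echo "$JAVA_HOME"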
Environment
- Talend Data Integration 8.0.1
-
Qlik Cloud Analytics: Data profile in Dataset is not completely refreshed after ...
A Qlik application has been successfully reloaded in a tenant. The reload has stored additional tables in a QVD.
Reviewing the Dataset (QVD) in the Catalog does not show the correct number of rows after the reload. The information is not automatically updated.
The rows only update once the Compute button is clicked.
Resolution
This is currently working as expected.
Qlik plans to provide scheduling capabilities for the Profile and Data Quality compute. No estimated release date or other details can yet be determined for this feature.
Cause
Profiling information is not automatically refreshed when QVD files change.
Internal Investigation ID(s)
SUPPORT-2319
Environment
- Qlik Cloud Analytics
-
Qlik NPrinting and the CVE-2025-32433 Erlang/OTP vulnerability
Erlang/Open Telecom Platform (OTP) has disclosed a critical security vulnerability: CVE-2025-32433.
Is Qlik NPrinting affected by CVE-2025-32433?
Resolution
Qlik NPrinting installs Erlang OTP as part of the RabbitMQ installation, which is essential to the correct functioning of the Qlik NPrinting services.
RabbitMQ does not use SSH, meaning the workaround documented in Unauthenticated Remote Code Execution in Erlang/OTP SSH is already applied. Consequently, Qlik NPrinting remains unaffected by CVE-2025-32433.
All Qlik NPrinting versions released from the 20th of May 2025 onwards will include patched versions of OTP and fully address this vulnerability.
Environment
- Qlik NPrinting
-
Configure Qlik Sense Mobile for iOS and Android
The Qlik Sense Mobile app allows you to securely connect to your Qlik Sense Enterprise deployment from your supported mobile device. This is the process of configuring Qlik Sense to function with the mobile app on iPad / iPhone.
This article applies to the Qlik Sense Mobile app used with Qlik Sense Enterprise on Windows. For information regarding the Qlik Cloud Mobile app, see Setting up Qlik Sense Mobile SaaS.
Content:
- Pre-requirements (Client-side)
- Configuration (Server-side)
- Update the Host White List in the proxy
- Configuration (Client side)
Pre-requirements (Client-side)
See the requirements for your mobile app version on the official Qlik Online Help > Planning your Qlik Sense Enterprise deployment > System requirements for Qlik Sense Enterprise > Qlik Sense Mobile app
Configuration (Server-side)
Acquire a signed and trusted Certificate.
Out of the box, Qlik Sense is installed with HTTPS enabled on the hub and HTTP disabled. Due to iOS specific certificate requirements, a signed and trusted certificate is required when connecting from an iOS device. If using HTTPS, make sure to use a certificate issued by an Apple-approved Certification Authority.
Also check Qlik Sense Mobile on iOS: cannot open apps on the HUB for issues related to Qlik Sense Mobile on iOS and certificates.
For testing purposes, it is possible to enable port 80.
(Optional) Enable HTTP (port 80):
- Open the Qlik Sense Management Console and navigate to Proxies.
- Select the Proxy you wish to use and click Edit Proxy.
- Check Allow HTTP
Update the Host White List in the proxy
If not already done, add an address to the White List:
- In Qlik Management Console, go to CONFIGURE SYSTEM -> Virtual Proxies
- Select the proxy and click Edit
- Select Advanced in Properties list on the right pane
- Scroll to Advanced section in the middle pane
- Locate "Allow list"
- Click "Add new value" and add the addresses being used when connecting to the Qlik Sense Hub from a client. See How to configure the WebSocket origin allow list and best practices for details.
Generate the authentication link:
An authentication link is required for the Qlik Sense Mobile App.
- Navigate to Virtual Proxies in the Qlik Sense Management Console and edit the proxy used for mobile App access
- Enable the Client authentication link menu in the far right menu.
- Generate the link.
Note: In the client authentication link host URI, you may need to remove the trailing "/" from the URL; for example, http://10.76.193.52/ would become http://10.76.193.52
Associate User access pass
Users connecting to Qlik Sense Enterprise need a valid license available. See the Qlik Sense Online Help for more information on how to assign available access types.
Qlik Sense Enterprise on Windows > Administer Qlik Sense Enterprise on Windows > Managing a Qlik Sense Enterprise on Windows site > Managing QMC resource > Managing licenses
- Managing professional access
- Managing analyzer access
- Managing user access
- Creating login access rules
Configuration (Client side)
- Install Qlik Sense mobile app from AppStore.
- Provide authentication link generated in QMC
- Open the link from your device (this can also be done by going to the Hub, clicking on the menu icon at the top right, and selecting "Client Authentication"); the installed application will be triggered automatically, and the configuration parameters will be applied.
- Enter your user credentials for the Qlik Sense server