This article provides an overview of how to manage users using Qlik Application Automation. This approach can be useful when migrating from QlikView or Qlik Sense Client Managed to Qlik Sense Cloud when security concerns prevent the use of Qlik-CLI and PowerShell scripting.
You will find an automation attached to this article that works with the Microsoft Excel connector. More information on importing automations can be found here.
Content
In this example, we use a Microsoft Excel file as the source file to manage users. A named sheet (for example, Users) must be added, and the sheet name must also be provided as input when running the automation. The sheet must contain these headers: userId, Name, Subject, Email, Roles, Licence, and Flag.
Example of sheet configuration:
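For illustration, a configured sheet might look like this (all values are hypothetical):

userId | Name | Subject | Email | Roles | Licence | Flag
 | Jane Doe | janedoe | jane.doe@example.com | Developer | professional | create
 | John Smith | johnsmith | john.smith@example.com | | professional | delete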
If users are to be created, the Flag column must be set to create. If users are to be deleted, there's no need to include roles, but the Flag column must be set to delete.
Add the List Rows With Headers block from the Microsoft Excel connector to read the values that have been configured in the Excel sheet.
When running the automation, you must provide input that includes the name of the worksheet to read data from, the first and last cell to read, and whether users are to be created or deleted. Example:
Input | Value
Worksheet Name | Users
Excel Start Cell | A1
Excel End Cell | G5
Mode | Create
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support for the solution below may not be provided by Qlik Support.
How to manage space membership (users)
With Microsoft and Google terminating support for Basic Authentication, Qlik suggests using a dedicated tool (such as Sendgrid, Mailchimp, or Mailgun) for bulk emails. While Qlik does not specifically endorse any of these services over the others, here's an example of how to set up Qlik Cloud Services SMTP settings with Sendgrid, taking advantage of their free tier. Going beyond that tier might require a paid subscription from either Sendgrid or one of its competitors.
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support for the solution below may not be provided by Qlik Support.
Implement advanced SMTP authentication options for alerts and subscriptions in Qlik Cloud
How to use Mailchimp bulk email service SMTP with Qlik Cloud Services and Qlik Application Automation
You have been running Qlik Sense normally for quite some time. Over time you have accumulated an abundance of reload tasks configured in your Qlik Sense QMC.
Recently however, you have noticed that Qlik Sense QMC reload tasks are in the following state:
To resolve the issue, it is recommended to add an additional scheduler node (or nodes) to manage the ever-increasing number of reload tasks in the affected Qlik Sense environment.
As time goes on, add additional scheduler nodes in proportion to the increasing number of reload tasks added/deployed in your environment.
The existing Qlik Sense scheduler nodes simply cannot manage the additional burden placed upon them by the ever-increasing number of reload tasks.
Concurrent Reload Settings in Qlik Sense Enterprise
QB-20013
With Qlik Application Automation, you can get data out of Qlik Cloud and distribute it to different users in formatted Excel files. The workflow can be automated by leveraging the connectors for Office 365, specifically Microsoft SharePoint and Microsoft Excel.
Here I share two example Qlik Application Automation workspaces that you can use and modify to suit your requirements.
Content:
Video:
Note - These instructions assume you have already created connections as required in Example 1.
This On-Demand Report Automation can be used across multiple apps and tables. Simply copy the extension object between apps & sheets, and update the Object ID (Measure 3) for each instance.
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support for the solution below may not be provided by Qlik Support.
To help Qlik customers manage costs more effectively, Qlik has developed the Qlik Snowflake Monitoring application, designed to provide invaluable insights about your Snowflake costs, usage, inventory, security, performance and contract utilization. This app utilizes Qlik's Associative Engine to connect directly to your Snowflake instance and reveal insights from Snowflake's detailed metadata, offering valuable information that traditional query-based tools and Snowflake's own reports are unable to provide.
Leveraging Qlik Application Automation and Data Alerts, you can:
*Minor configuration is required on first run to create the required data connections.
Content:
This automation template is a fully guided installer/updater for the Qlik Snowflake Monitor. Leverage this automation template to easily install and update this application. The application itself is community-supported and is provided through Qlik’s Open-Source Software GitHub, and is therefore subject to Qlik’s open-source guidelines and policies.
For more information, refer to the GitHub Repository.
If the monitoring app was installed manually (i.e., not through the application automation installer), then the app will not be detected as existing, and the automation will install a new copy side-by-side. Any subsequent executions of the automation will detect the newly installed monitoring application and check its version. This is because the application is tagged with ‘QCS - QSM - {App Name}’ and ‘QCS - QSM - {Version}’ during the installation process through the automation. Manually installed applications will not have these tags and therefore will not be detected.
The Qlik Snowflake Monitor requires two connections: one to your Snowflake instance to feed the data for your analytics, and one REST connection to the qlik-oss repository to run a version check on the monitor.
You will need to create a custom User, Role, and Warehouse on your Snowflake tenant. This ensures the user and role can see the monitoring details and can themselves be monitored.
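As a rough sketch of that setup, the SQL might look like the following. All object names here are hypothetical, and the GRANT IMPORTED PRIVILEGES statement is what exposes Snowflake's ACCOUNT_USAGE metadata to the role; adapt everything to your own security policies.

-- Hypothetical object names; adjust to your conventions
CREATE ROLE IF NOT EXISTS QLIK_MONITOR_ROLE;
CREATE WAREHOUSE IF NOT EXISTS QLIK_MONITOR_WH WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60;
CREATE USER IF NOT EXISTS QLIK_MONITOR_USER PASSWORD = '<strong-password>' DEFAULT_ROLE = QLIK_MONITOR_ROLE DEFAULT_WAREHOUSE = QLIK_MONITOR_WH;
GRANT ROLE QLIK_MONITOR_ROLE TO USER QLIK_MONITOR_USER;
-- Expose Snowflake's ACCOUNT_USAGE metadata to the monitoring role
GRANT IMPORTED PRIVILEGES ON DATABASE SNOWFLAKE TO ROLE QLIK_MONITOR_ROLE;
-- Let the role use and monitor its warehouse
GRANT USAGE ON WAREHOUSE QLIK_MONITOR_WH TO ROLE QLIK_MONITOR_ROLE;
GRANT MONITOR ON WAREHOUSE QLIK_MONITOR_WH TO ROLE QLIK_MONITOR_ROLE;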
For authentication, this setup defaults to username & password.
Finally, you need to name the connection as follows:
If you wish to use an alternative authentication method, please follow the documentation accordingly on both Snowflake & Qlik.
The REST connection is used to fetch version details from the GitHub repository. On reload, it looks for the latest released version on GitHub and checks this against the version you have installed. You can later use this in ‘Part Three’ to create an alert when updates to the application are available. To create a REST connection, the following information is required:
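As a sketch of what such a version check targets: GitHub exposes the latest release of a public repository through its REST API, along the lines of the URL below. The repository path shown is illustrative, not the exact qlik-oss path.

https://api.github.com/repos/qlik-oss/<repository-name>/releases/latest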
Once these two connections have been set up, you can reload the application. The application has been created to accommodate Snowflake tenants of all sizes. If you have a small tenant, you will find the initial run of the load script can take around 30 minutes, and for larger tenants this can be over an hour or two. Subsequent runs will utilize cached QVDs that update daily to reduce reload times each subsequent day.
If a new release of the application is made, occasionally a full reload of data is required, but generally, if the data schema is unchanged, the existing QVDs are maintained. This is achieved by embedding version numbers in the names of the QVDs used to store the data.
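A minimal load-script sketch of that pattern (variable, table, and file names are hypothetical):

// Version token in the QVD name; a schema change bumps the version and forces fresh files
SET vModelVersion = v1.2;
STORE QueryHistory INTO [lib://DataFiles/QSM_QueryHistory_$(vModelVersion).qvd] (qvd);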
The application has the following two variables:
To create a new Data Alert for updates to the monitoring app, follow these steps:
The Qlik Snowflake Monitor can be easily installed by following the steps above. If you wish to find out more, check out this Ometis blog post and this Ometis Webinar for a run-through of the analytics this application can offer.
If you face any issues, please raise an issue through the GitHub repository.
This article provides an overview of how to send straight table data to email as an HTML table using Qlik Application Automation.
The template is available on the template picker. You can find it by navigating to Add new -> New automation -> Search templates and searching for 'Send straight table data to email as table' in the search bar, and clicking the Use template option.
You will find a version of this automation attached to this article: "Send-straight-table-data-to-email-as-HTML-table.json".
Content:
The following steps describe how to build the demo automation:
An example output of the email sent:
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support for the solution below may not be provided by Qlik Support.
How to export more than 100k cells using Get Straight Table Data Block
This article gives an overview of the measure distribution use case. It explains a basic example of a template configured for this scenario and additions for a more advanced use case.
For this use case, we will define the following keywords/expressions:
By using this approach, all you need to do is create/update your master items in your main app, and then push these updates to all your destination apps. This way, all destination apps have the same master items.
To support this use case, we created a basic template, which uses measures as master items.
By running this template, you will be able to distribute all the measures created in your main app to all the apps available in the destination space.
All you need to do is select your main app and your destination space.
Of course, this is just a basic implementation. This template can be upgraded to suit more advanced scenarios.
Let's go over a few examples:
The changes made by this automation won't be accessible immediately in other sessions (like the Qlik Sense UI); more info on that can be found here: Automation session delay. It can take up to 40 minutes for these changes to become visible in other sessions. If the changes are needed sooner, the Save App block can be used, but keep in mind it can only be used once for every app that's changed by the automation. More information on the Save App block can be found here: How to use the Save App block.
For the above example, it's best to add an additional List Apps block that's configured exactly the same as the first one, so it returns the same apps. We'll add a Save App block in the loop of the new List Apps block and configure it to run for every app that's returned. This way, we make sure that the Save App block is executed only once for every app that was changed. See the image below for an example with the Save App block.
First part: includes an input block for the source/destination apps and for the measure tags.
Second part: includes a measure deletion flow, for a complete sync automation process.
Both these template examples are available as attachments.
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support for the solution below may not be provided by Qlik Support.
This article explains how to import and export master items to and from a Qlik Sense app using the Microsoft Excel connector in Qlik Application Automation.
Content:
The first part of this article will explain how to export all of your master items configured in your Qlik Sense App to a Microsoft Excel sheet. The second part will explain how to import those master items from the Microsoft Excel sheet back to a Qlik Sense App.
For this, you will need a Qlik Sense app in your tenant that contains the measures, dimensions, and variables you want to export. You'll also need an empty Microsoft Excel file. The image below contains a basic example of exporting master items.
The following steps will guide you through recreating the above automation:
An export of the above automation can be found at the end of this article as Export master items to a Microsoft Excel sheet.json
For this example, you'll first need a Microsoft Excel file with sheets configured for each master item type (dimensions, measures, and variables). Use the above example to generate this file. The image below contains a basic example of importing master items from Microsoft Excel to a Qlik Sense app.
An export of the above automation can be found at the end of this article as Import master items from a Microsoft Excel Sheet.json
Follow the same steps to build automations that import/export dimensions and variables.
Let's go over some edge cases when exporting information to Microsoft Excel:
Please check the following articles for more information about working with master items in Qlik Application Automation and also uploading data to Microsoft Excel.
Follow the steps provided in this article How to import & export automations to import the automation from the shared JSON file.
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support for the solution below may not be provided by Qlik Support.
In my recent Getting SaaSsy with DataRobot post, I documented how to use the DataRobot Analytics Connector from within your Qlik Sense applications. I know this will sound crazy, but what if you want to make predictions on data you aren't loading into your application? Maybe you are collecting input parameters in your application from end users to play what-if games. Maybe you will record the predictions, but you also want to take immediate action based on their values (i.e., prescriptive analytics). Well, those things sound like perfect use cases for Qlik Application Automation.
It gets better, my friends. Whether you are using a dedicated on-premises DataRobot server, a dedicated tenant, or you are on the leading-edge path with DataRobot's shiny new AI Cloud Manager using Paxata, Qlik Application Automation has you covered, and so do I.
In this post, I will help you identify the right DataRobot Connector Block to use for your path, help you understand how to execute predictions, and help you understand what to do with the output from the predictions.
You have already chosen your DataRobot path; now it's just a matter of choosing the correct block from the DataRobot Connector. You probably would have guessed from the elaborate way I described the DataRobot choices which Qlik Application Automation block goes to which environment. But to be sure ... If you have a dedicated DataRobot server, or you have a dedicated tenant, you should use the List Predictions from Dedicated Prediction API block. If you are using the DataRobot AI Manager environment with Paxata, you should use the List Predictions block.
Oh no! What's that you are saying? You weren't told which path your organization chose, you were just given credentials and you just log in. Don't sweat it. I can help you with that. Just go to your Deployments within DataRobot, choose the Deployment you are going to execute predictions against, and choose Predictions, Prediction API and Real-time. DataRobot will provide all of the clues we need to choose the right block.
If the API URL contains app2.datarobot.com, like in the first image below, you are working with their AI Cloud and will need to use the List Predictions block. However, if you see a dedicated path in your API_URL, such as Qlik.orm.datarobot.com (second image), you will need to use the List Predictions from Dedicated Prediction API block.
There are some other clues above as well. Notice that in the first image the rest of the URL path contains api/v2/deployments, while the second image contains predApi/v1.0/deployments. It's basically DataRobot telling you which of their APIs you need to utilize.
So how will that help you know for sure? One of the things that many people seldom look at with Qlik Application Automation blocks is the Description. Simply drag either/both of the blocks onto your canvas and scroll all the way down in the right panel. If you look at the List Predictions from Dedicated Prediction API description, you will see the following; notice it clearly indicates predApi/v1.0/deployments.
However, if you press the Show API endpoint link for the List Predictions block it will look like this. Both are dead giveaways as to the block/path you should choose.
Regardless of which block you are using, you will first need to create a Connection for Qlik Application Automation to your DataRobot environment. If you have a Dedicated server, your Connection details will look like this. Notice that you will need to copy the api_key value right from the DataRobot Deployment Details:
Your DataRobot AI Cloud connection will look similar and again you would need to copy your api_key from the deployment details. My DataRobot AI Cloud is just a "trial", hence I chose that region, while my dedicated tenant (above) is the US. The biggest difference is in the domain. Dedicated connections will be app.datarobot.com and AI Cloud connections will be app2.datarobot.com:
Once you test/save the connections we are ready to start making predictions. The List Predictions block is the easiest to set up so we will start with that one. Simply click the drop-down in the Deployment Id field, press "Do Lookup" and then choose the specific deployment model you are going to be making predictions against.
Then you simply provide the Input data you want to pass to the deployment to have predictions made for. More about the Prediction Data later, but for now notice that I've simply hard-coded a JSON string with field/value pairs:
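For reference, the Prediction Data is a JSON list of row objects, one object per prediction, whose keys must match your model's feature names. A rough sketch (field names and values here are hypothetical):

[
    { "MasterPatientID": "12345", "race": "Caucasian", "number_inpatient": 2 },
    { "MasterPatientID": "67890", "race": "Other", "number_inpatient": 0 }
]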
The List Predictions from Dedicated Prediction API block requires a few more things to be completed. The first is the Dedicated Prediction Url. Good thing I had you bring up your deployment details, because we will just copy it. Notice you do not need the HTTPS:// or the /predApi.. text, just the actual URL information.
Next, we will again simply click the drop-down for the Deployment ID, click Do Lookup, and then choose our desired deployment.
Next, we copy the DataRobot Key from our deployment details, and then we can insert our JSON block. Again, more on that later, so don't panic in thinking I'm suggesting that you hand-code the values you want to predict. It's just to make this section easier to navigate. 😁
Qlik Application Automation provides 4 additional parameters that are part of the DataRobot API specification. You can define the Passthrough Columns, Passthrough Columns Set, turn on Prediction Warnings and set the Decimals Number format.
You can refer directly to the DataRobot API documentation for all of the details you wish. For instance, notice that I have the Prediction Warning Enabled set to "true." Getting warnings sounded like a good idea. But alas, I ended up with an error.
Well, it turns out that in order to utilize the Prediction Warning Enabled there is work that must be done on our Deployment within DataRobot.
I guess I could have saved myself the trouble had I read the documentation. Oh well, I simply changed my default back to false so that the prediction can run.
Above I simply demonstrated the JSON format you need for your Prediction Data with hard-coded values. I've used my DataRobots and have predicted with a 99.99999999% confidence level that your goal in reading this isn't to hardcode 50+ input values each time you want a prediction. Instead, that data will come from somewhere else. Which is perfectly ok. Maybe you will be pulling the values from some other system as part of a workflow. When event A triggers this Qlik Application Automation, you will go do B and C and then assign the output from those things to variables that you will use as the Prediction Data. That's a great plan ... simply choose your variables, and use them where you need them in your Prediction Data. Notice I have already assigned the MasterPatientID variable and am in the process of choosing the race variable below.
I'm so sorry. You don't like to use variables, and you weren't doing A, B, and C; you were actually firing a SQL query live based on input to your workflow, and you wanted to use the data from that query. That is brilliant. Pulling the live data when whatever event you have chosen triggers the automation. You should write some posts. No problem, Qlik Application Automation will absolutely allow you to do that.
Or perhaps you are using a writeback solution, like Inphinity Forms, within a Qlik Sense application to capture input parameters and you wish to use those values. Do that.
Or perhaps you are ... You get the point. The Prediction Data simply needs to be a JSON block containing the field/value pairs. How you construct it, or read it from an S3 bucket, or pull it out of thin air doesn't matter. Which is the beauty of working with DataRobot within Qlik Application Automation.
Woohoo, you now have a block that will execute a deployment in your DataRobot environment, regardless of which kind, and we are now ready for those wonderful predictions. Perhaps the first thing you noticed about the blocks List Predictions and List Predictions from Dedicated Prediction API was that they start with List as opposed to Get. It's of course because you may be passing a single row of data as Prediction Data or you could be passing many items in the JSON block. So these blocks are handled as lists, even if it is just a list of 1 prediction.
The DataRobot Connector for loading data into our applications simply returns the Prediction value, which is 0 in this case (the patient is not predicted to be readmitted). However, notice below that within Qlik Application Automation either prediction block will return the Prediction as well, but it will also return a list of the Prediction values and the scores for each possible value. In my case, the 1 (likely to be readmitted) was scored at 0.428973004, whereas the 0, the Prediction, was scored at 0.571026996.
Who cares?
Well, maybe you do. As I started this post I mentioned that perhaps we want to take action(s) based on the predictions which might be why you are making the prediction in your Qlik Application Automation workflow instead of just making the prediction in a Qlik Sense Application. If we are writing a flow that is "prescriptive" we might want to check the values. Ooooh .49999999999 vs .500000000001. Maybe that will be Action A, just email someone. While .000000001 vs .999999999 tells us that it's safe to go ahead and take the really expensive Action Z. So we might want to set up a Conditional expression.
Regardless of what we do with the values, Qlik Application Automation allows you to simply choose the values right from the block, just like it allowed you to choose Variables or data from another source.
If you don't already know me I will bring you up to speed quickly. I have very defined boundaries and am really particular about how things are worded. For example, take the phrase Data Science. Well, Science is explainable. Therefore, if something isn't explainable it isn't science. And if your predictions aren't explainable, then that isn't Data Science, it's just Data. One of the key reasons you are likely using DataRobot is the fact that it can so wonderfully return explanations for its predictions.
The Prediction of 0 above is nice. But knowing what factors led to the prediction may be just as valuable when helping us choose our prescriptive actions. Well, my friends, Qlik Application Automation has you covered for that scenario as well. In fact, you can see from the following image the block is literally raising its hand and begging you to choose it. List Prediction Explanation from Dedicated Prediction API will give you not only the Prediction, and the Prediction values but it will also return the explanations to you.
Wait, something must be wrong. I see a qualitativeStrength of +++, but the second is --. What do those mean? Oh yeah, now I remember ... Qlik Application Automation is just calling the provided DataRobot APIs, so I might as well check the documentation from DataRobot so that I get a full and complete understanding of the input parameters I can choose for the block and understand the output values. Sure enough, it's covered.
https://docs.datarobot.com/en/docs/api/reference/predapi/pred-ref/dep-predex.html
I see you out there on the leading edge doing Time Series Predictions in DataRobot. Not an issue, Qlik Application Automation has you covered with a block as well. Simply choose the List Time Series Predictions from Dedicated Prediction API and you will be good to go.
The initial inputs needed are already covered above. However, there are a few additional parameters you will need to input as well.
Of course, DataRobot has you covered with complete documentation at https://docs.datarobot.com/en/docs/api/reference/predapi/pred-ref/time-pred.html
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support for the solution below may not be provided by Qlik Support.
Getting SaaSsy with Data Robot
Lions and Tigers and Reading and Writing Oh My
This article explains how the Amazon SNS connector in Qlik Application Automation can be used to set up webhooks that trigger when an object creation event occurs in Amazon S3. This connector only has webhooks available.
Content:
Search for the "Amazon SNS" connector in Qlik Application Automation. When you click connect, you will be prompted for the following input parameters:
You must obtain the AWS Access Key from IAM in your AWS console. It can be found by going to the IAM section in AWS and choosing Users in the left-side panel.
Here you can either choose an existing user or create a new one by clicking the "Add Users" button in the top right.
When you create a new user, you must provide a user name and click Next. You do not need to give this user access to the AWS console. In the next step, you will assign permissions to this IAM user.
The following policy needs to be created and attached to the IAM user; replace account-id with your account ID:
Other suggested permissions to add are:
Furthermore, the IAM user must be made an owner of an S3 bucket when creating a notification configuration.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutBucketNotification",
                "iam:PassRole",
                "sns:Publish",
                "sns:CreateTopic",
                "sns:Subscribe"
            ],
            "Resource": [
                "arn:aws:s3:::*",
                "arn:aws:iam::account-id:role/*",
                "arn:aws:sns:*:account-id:*"
            ]
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": "sns:Unsubscribe",
            "Resource": "*"
        }
    ]
}
You will have to create an access key for the IAM user.
This can be done in the (a) Users menu and in the (b) Security credentials tab. Click (c) Create access key.
Choose Third-party service, confirm that you understand the recommendation shown, and click Next:
You will now have your access key and secret key and can finish creating the data source in Qlik Application Automation:
You can use this in an automation, but only as a webhook. When you create a new automation, you will be presented with a blank canvas. Select the Start block and change the run mode to webhook.
Choose an event type next. These are currently limited to S3 object creation events. You will have lookup capabilities available for other parameters, such as bucket and topic selection:
After saving the automation, you can test the webhook by uploading objects to your S3 bucket and confirming in the automation run history that the automation is triggered.
The benefit is that you can now trigger tasks after an object is uploaded to S3. Common tasks include reloading a Qlik Sense app or triggering a data pipeline in any of our other connectors:
This article describes how to resolve the NPrinting connection verification error:
x Qlik NPrinting webrenderer can reach Qlik Sense hub error
Installing, upgrading, and managing the Qlik Cloud Monitoring Apps has just gotten a whole lot easier! With two new Qlik Application Automation templates coupled with Qlik Data Alerts, you can now:
The above allows you to deploy the monitoring apps to your tenant with a hands-off approach. Dive into the individual components below.
Some monitoring apps are designed for specific Qlik Cloud subscription types. Refer to the compatibility matrix within the Qlik Cloud Monitoring Apps repository.
Content:
This automation template is a fully guided installer/updater for the Qlik Cloud Monitoring Applications, including but not limited to the App Analyzer, Entitlement Analyzer, Reload Analyzer, and Access Evaluator applications. Leverage this automation template to quickly and easily install and update these applications (or a subset of them) with all their dependencies. The applications themselves are community-supported and are provided through Qlik's Open-Source Software GitHub, and are therefore subject to Qlik's open-source guidelines and policies.
For more information, refer to the GitHub repository.
Note that if the monitoring applications have been installed manually (i.e., not through this automation), then they will not be detected as existing, and the automation will install new copies side-by-side. Any subsequent executions of the automation will detect the newly installed monitoring applications and check their versions. This is because the applications are tagged with "QCMA - {appName}" and "QCMA - {version}" during the installation process through the automation. Manually installed applications will not have these tags and therefore will not be detected.
This template is intended to be used alongside the Qlik Cloud Monitoring Apps for user-based subscriptions template. This automation provides the ability to keep the API key and associated data connection used for the Qlik Cloud Monitoring Apps up to date on a scheduled basis. Simply input the space Id where the monitoring_apps_REST data connection should reside, and the automation will recreate both the API key and data connection regularly. Ensure that the cadence of the automation’s schedule is less than the expiry of the API key.
Enter the ID of the space where the monitoring_apps_REST data connection should reside.
Ensure that this automation is run off-hours from your scheduled monitoring application reloads so it does not disrupt the reload process.
Each Qlik Cloud Monitoring App has the following two variables:
With these variables, we can create a new Qlik Data Alert on a per-app basis. For each monitoring app that you want to be notified about if it falls out of date:
Here is an example of an alert received for the App Analyzer, showing that at this point in time, the latest version of the application is 5.1.3 and that the app is out of date:
Q: Can I re-run the installer to check if any of the monitoring applications can be upgraded to a later version?
A: Yes. Run the installer, select which applications should be checked, and select the space that they reside in. If any of the selected applications are not installed or are upgradeable, a prompt will appear to continue the install/upgrade for the relevant applications.
Q: What if multiple people install monitoring applications in different spaces?
A: The template scopes the applications install process to a “target” space, i.e., a shared space (if not published) or a managed space. It will scope the API key name to `QCMA – {spaceId}` of that target space. This allows the template to install/update the monitoring applications across spaces and across users. If one user installs an application to “Space A” and then another user installs a different monitoring application to “Space A”, the template will see that a data connection and associated API key (in this case from another user) exists for that space already and it will install the application leveraging those pre-existing assets.
Q: What if a new monitoring application is released? Will the template provide the ability to install that application as well?
A: Yes. The template receives the list of applications dynamically from GitHub. If a new monitoring application is released, it will become available immediately through the template.
Q: I would like to be notified whenever a new version of a monitoring application is released. Can this template do that?
A: As per the article above, the automation templates are not responsible for notifications of whether the applications are out of date. This is achieved using Qlik Alerting on a per-application basis as described in Part 3.
Q: I have updated my application, but I noticed that it did not preserve the history. Why is that?
A: The history is preserved in the prior versions of the application’s QVDs, so the data is never deleted and can be loaded into the older version. Each upgrade will generate a new set of QVDs, as the data models for the applications sometimes change due to bug fixes, updates, new features, etc. If you want to preserve the history when updating, the application can be upgraded with the “Publish side-by-side” method so that the older version of the application remains as an archival application. However, note that the Qlik Alert (from Part 3) will need to be recreated, and any community content that was created on the older application will not be transferred to the new application.
It is possible to export the list of tenant users to a .json file using the "user ls" command from the Qlik Command Line Interface (qlik-cli).
The scripts in this article are provided as-is and are intended for guidance only.
As a tenant admin, download and configure qlik-cli.
qlik user ls --limit 1000 > tenantusers.json
[
{
"assignedGroups": [],
"assignedRoles": [
{
"id": "608050f7634644db3678b1a2",
"level": "user",
"name": "Developer",
"type": "default"
},
{
"id": "608050f7634644db3678b17f",
"level": "admin",
"name": "TenantAdmin",
"type": "default"
},
{
"id": "605a1c2151382ffc836af862",
"level": "user",
"name": "SharedSpaceCreator",
"type": "default"
},
{
"id": "605a1c2151382ffc836af866",
"level": "user",
"name": "ManagedSpaceCreator",
"type": "default"
},
{
"id": "605a1c2151382ffc836af86b",
"level": "user",
"name": "DataSpaceCreator",
"type": "default"
},
{
"id": "605a1c2151382ffc836af85d",
"level": "admin",
"name": "AnalyticsAdmin",
"type": "default"
},
{
"id": "605a1c2151382ffc836af85f",
"level": "admin",
"name": "DataAdmin",
"type": "default"
},
{
"id": "63580b8d5cf9728f19217be0",
"level": "user",
"name": "PrivateAnalyticsContentCreator",
"type": "default"
},
{
"id": "6356f0425cf9728f1962b942",
"level": "user",
"name": "DataServicesContributor",
"type": "default"
}
],
"created": "2020-05-18T09:38:29.214Z",
"createdAt": "2020-05-18T09:38:29.214Z",
"email": "martina.testoni@dkdaklaldkdaklladaaddddl.com",
"id": "USERID1",
"lastUpdated": "2023-04-04T07:32:00.756Z",
"lastUpdatedAt": "2023-04-04T07:32:00.756Z",
"name": "Martina Testoni",
"picture": "https://s.gravatar.com/avatar/gravatarimage=pg\u0026d=https%3A%2F%2Fcdn.auth0.com%2Favatars%2Fdp.png",
"preferredLocale": "",
"preferredZoneinfo": "Europe/Copenhagen",
"roles": [
"Developer",
"TenantAdmin",
"SharedSpaceCreator",
"ManagedSpaceCreator",
"DataSpaceCreator",
"AnalyticsAdmin",
"DataAdmin",
"PrivateAnalyticsContentCreator",
"DataServicesContributor"
],
"status": "active",
"subject": "auth0|SUBJECTID2",
"tenantId": "TENANTID"
},
{
"assignedGroups": [],
"assignedRoles": [
{
"id": "608050f7634644db3678b17f",
"level": "admin",
"name": "TenantAdmin",
"type": "default"
},
{
"id": "605a1c2151382ffc836af86b",
"level": "user",
"name": "DataSpaceCreator",
"type": "default"
},
{
"id": "608050f7634644db3678b1a2",
"level": "user",
"name": "Developer",
"type": "default"
},
{
"id": "605a1c2151382ffc836af866",
"level": "user",
"name": "ManagedSpaceCreator",
"type": "default"
},
{
"id": "63580b8d5cf9728f19217be0",
"level": "user",
"name": "PrivateAnalyticsContentCreator",
"type": "default"
},
{
"id": "605a1c2151382ffc836af862",
"level": "user",
"name": "SharedSpaceCreator",
"type": "default"
},
{
"id": "6356f0425cf9728f1962b95c",
"level": "user",
"name": "Steward",
"type": "default"
},
{
"id": "605a1c2151382ffc836af85d",
"level": "admin",
"name": "AnalyticsAdmin",
"type": "default"
},
{
"id": "62bb165356d1879582c1b468",
"level": "admin",
"name": "AuditAdmin",
"type": "default"
},
{
"id": "605a1c2151382ffc836af85f",
"level": "admin",
"name": "DataAdmin",
"type": "default"
}
],
"created": "2023-03-31T08:44:37.332Z",
"createdAt": "2023-03-31T08:44:37.332Z",
"email": "Gentile.Faccenda@dkdaklaldkdaklladaaddddl.com",
"id": "USERID2",
"lastUpdated": "2023-04-03T11:24:35.037Z",
"lastUpdatedAt": "2023-04-03T11:24:35.037Z",
"name": "Gentile Faccenda",
"picture": "https://s.gravatar.com/avatar/randomurl=https%3A%2F%2Fcdn.auth0.com%2Favatars%2Fdp.png",
"roles": [
"TenantAdmin",
"DataSpaceCreator",
"Developer",
"ManagedSpaceCreator",
"PrivateAnalyticsContentCreator",
"SharedSpaceCreator",
"Steward",
"AnalyticsAdmin",
"AuditAdmin",
"DataAdmin"
],
"status": "active",
"subject": "auth0|IDPSUBJECT2",
"tenantId": "TENANTID"
}
]
qlik user ls --limit 1000 | ConvertFrom-Json | ConvertTo-Csv > tenantusers.csv
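If you prefer a Unix-style toolchain instead of PowerShell, a rough equivalent using jq (assuming jq is installed; the column selection is illustrative) might be:

qlik user ls --limit 1000 | jq -r '.[] | [.name, .email, .status] | @csv' > tenantusers.csv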
Note: We recently (May 25th) released a new version of the Snowflake connector. If you had automations using Snowflake prior to that date, the connector will show as Snowflake - deprecated. To use the new version, simply replace those blocks with blocks from the current Snowflake connector.
This article gives an overview of the available blocks in the Snowflake connector in Qlik Application Automation. It will also go over some basic examples of retrieving data from a Snowflake database and creating a record in a database.
This connector has the following blocks:
To create a new connection to Snowflake, the following parameters are required:
The Do Query block can be used to perform actions in Snowflake that aren't supported by the other blocks. See the below example on creating a new table.
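A minimal sketch of a statement you might pass to the Do Query block (table and column names are hypothetical):

-- Create a simple table via the Do Query block
CREATE TABLE IF NOT EXISTS SALES_SUMMARY (
    REGION VARCHAR(50),
    TOTAL_SALES NUMBER(18,2),
    LOAD_DATE TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);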
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support for the solution below may not be provided by Qlik Support.
This article explains how a list can be used as values for the Add Selection To Report and Add Selection To Sheet blocks in the Qlik Reporting connector in Qlik Application Automation.
You might have noticed that the Values input field in these blocks only allows you to specify values one by one. But in some scenarios, you'll want to specify a list of field values instead of adding them one by one.
If you're new to reporting, please read our Reporting tutorial first.
The source of these values can either be the List Values Of Field block from the Qlik Cloud Services connector or any List ... block from a 3rd party storage tool like Microsoft Excel. In this example, we'll use the List Values Of Field block.
The example automation used in this article looks like this:
And this is what the example output of the List Values Of Field block looks like:
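In shape, that output is a list of objects, each carrying a qText property per field value, roughly like the sketch below (values hypothetical; the real output may include additional engine properties such as qNum):

[
    { "qText": "Germany" },
    { "qText": "France" },
    { "qText": "Spain" }
]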
In this case, the qText parameter is required as the value to make selections. Go to the Add Selection To Report block, make sure to specify the same field name as the one used in the List Values Of Field block, and enable the "Raw input" mode:
Remove the square brackets from the input field and click it to select the "Output from List Values Of Field" as the input for the Values input field:
This takes you to the output of the List Values Of Field block; click the qText parameter, then choose "Select all qText(s) from list ListValuesOfField" on the next screen.
That's it! When the automation now runs, a list of strings is mapped as the value for this selection. You can verify this by toggling the view mode in the automation's chronological output view:
If you want to use multiple selections for this report, add additional Add Selection To Report or Add Selection To Sheet blocks.
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support for the solution below may not be provided by Qlik Support.
Currently, in Qlik Application Automation it is not possible to export more than 100,000 cells using the Get Straight Table Data block.
Content:
To overcome this limit, the workaround is to export records in batches from the Qlik Sense straight table to the cloud storage platform of your choice. The prerequisite is a unique numerical field in your dataset. If you don't have such a field, you can add one using the RowNo() function in the load script, as shown below; it numbers the rows in the dataset.
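A minimal load-script sketch (table and file names are hypothetical):

// Add a sequential, unique row id to support batch-based export
Data:
LOAD
    RowNo() AS RowId,
    *
FROM [lib://DataFiles/sales_data.qvd] (qvd);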
In this example, we will export data from the Qlik Sense straight table to Dropbox as a CSV file.
You can also find an exported version of this automation and application attached to this article. More information on importing automations can be found here.
Automation Part 1
Automation Part 2
Automation Part 3