This group hosts information related to the Qlik Deployment Framework (QDF), providing best practices, libraries and utilities that facilitate the recommended setup and management of QlikView and Qlik Sense environments. An understanding of the Qlik platform is recommended before joining this group. Download QDF and documentation here: https://github.com/QlikDeploymentFramework/Qlik-Deployment-Framework/releases
Hello everyone,
I have a question for you: is it possible to create containers or use QDF scripts stored on cloud drives like Google Cloud Storage, Dropbox, etc.?
Thanks a lot!
Brunello
Introduction to Artificial Intelligence
Sometimes I wish I could go back to 1955 and stop John McCarthy from calling it "Artificial Intelligence." It's a term that, depending on where you work, you can't go five minutes without hearing once or twice, and that's fine. It's great that people are looking to the future. It's great that society is pushing forward with growth and all that warm and fuzzy stuff. Unfortunately, "AI" doesn't really do justice to what it's describing. Read the full article here: https://www.edu-right.com/2019/08/what-is-application-of-artificial-intelligence.html
I’m curious, how do you store your subs?
I'm torn between storing them all in one script, grouping them into library/topic scripts, or keeping one script per sub.
The standard QDF subs are stored with each sub in a separate script.
What do you prefer, and why?
Qlik Deployment Framework for Cloud (alpha)
These scripts are specific to Qlik Cloud, meaning that you cannot run Qlik Sense Desktop, QlikView or Qlik Sense server together with these scripts. Qlik Cloud spaces can be used as containers, as can classic containers stored on external storage that is mapped as a drive in Qlik Cloud.
This is an early alpha with several bugs and limitations in Qlik Cloud. Use standard QDF containers and replace the scripts with these. All QDF functions are loaded during initiation, but not all functions have been tested to work with Qlik Cloud.
SET vG.HomeContainer='lib://user:OneDrive - user/QDF_SaaS/Shared';
$(Include=$(vG.HomeContainer)/InitLink.qvs);
* From vG.SharedBaseVariablePath/ContainerMap.csv, vG.BasePath and vG.QVDPath are generated and used.
* Prefix and ContainerName are specified under vG.SharedBaseVariablePath/ContainerMap.csv (no AltPath is specified in the container map).
* Use $(include=$(vG.HomeContainer)/InitLinkSkip.qvs); to skip executing the initiation code that identifies related containers; this only works when initiation has executed successfully one time in the same location.
Link to git: Qlik Deployment Framework for Cloud (alpha)
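For orientation, a minimal usage sketch (the container content and file name are hypothetical) of the variables that initiation generates:
// After InitLink.qvs has executed, the generated container variables
// can be used to address files in the current container, for example:
LOAD * FROM [$(vG.QVDPath)/Facts.qvd] (qvd);  // Facts.qvd is a hypothetical QVD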
Hi, I'm so sorry for taking such a long time; work and family have made this project suffer. I have uploaded a QDF for Cloud alpha version here: https://github.com/QlikDeploymentFramework/Qlik-Deployment-Framework-Cloud
There are several things to take into consideration:
This is an early alpha with several bugs and limitations in Qlik Cloud.
* A shared container is a must and needs to be mapped within the reload script, as seen below:
SET vG.HomeContainer='lib://user:OneDrive - mail/QDF_SaaS/Shared';
$(Include=$(vG.HomeContainer)/InitLink.qvs);
* In this release, all container folders on external drives (folders you want QDF to identify as a global variable path) need to include the Info.txt file, as this file identifies the folder as a global path.
* External drives are really picky about trailing slashes; for OneDrive, no trailing slash is possible.
* To identify containers stored on external drives, the base URL needs to be specified in vG.SharedBaseVariablePath/ContainerMap.csv (see the sketch after this list).
* To identify spaces as containers, just add the space name under vG.SharedBaseVariablePath/ContainerMap.csv.
* Qlik Cloud spaces have limited support for subfolders, so when adding QDF containers to spaces (by the same name), only vG.BasePath and vG.QVDPath will work.
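To make the container map concrete, here is a hypothetical ContainerMap.csv. The notes above only name the Prefix, ContainerName and AltPath columns, so the exact layout, and where the external base URL goes, is my assumption:
Prefix,ContainerName,AltPath
sales,SalesSpace,
fin,FinanceContainer,lib://user:OneDrive - user/QDF_SaaS/FinanceContainer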
The next generation of BI tools has changed the IT landscape of enterprises with features such as self-service reporting, ease of development and deployment, and cloud and big data support. With these rich features and initiatives to modernize IT, there has been a surge in adoption of and migration to tools such as Qlik and Tableau. The enterprise tools are now focused on automation and on reducing IT development and maintenance overheads. And QA, being an integral and critical part of the development process, must have a clear automation approach supported by tools. This white paper describes a BI QA automation framework.
Enterprises use BI tools such as SAP BusinessObjects, SAP Crystal Reports, OBIEE, Qlik and Cognos to cater to their reporting needs. One of the key factors for successful BI project implementation is the level of trust in the data depicted in reports and dashboards. Lack of trust in the data causes attrition in user adoption and often results in project failure.
BI testing is the process of validating the data, format and performance of the reports, subject areas and security aspects of the governing project. Emphasis on thorough BI testing is key to improving the quality of the reports and user adoption. However, testing BI projects is different from traditional web application testing, since the contents of a report are generated automatically by the BI tool based on the tool's metadata. The focus of this article is to propose an automated approach to testing report data, with report migration projects as a typical use case.
Need for Report QA Automation
Report data testing is an important process for assessing the reliability of the information visualized on a report. It is required to sustain the users' trust in the reports, because reports are often used as a key basis for management decisions, and users assume that the information on a report is representative of the source data.
A typical approach to verifying report quality is to compare the calculations on the reports against the data in the source system or data warehouse. The BI testing process thus becomes a manual one, which increases the workload for testers and makes it prone to mistakes. Most BI tools manipulate or transform the source data, which makes manual testing even harder. Manual testing is also a slow and cumbersome approach, which compromises the accuracy of the testing.
It needs to be ensured that report data exactly matches the expected output, as the reports act as a basis for critical decisions. A manual QA process for validating reports therefore has multiple downsides.
QA Automation Solution Design
As described in the solution architecture below, the QA automation is done in a sequence of steps.
Fig 1: QA Automation Solution Architecture
Generated Code Template
QA Automation Report
Fig 3: Report Output
The figure above is a snapshot of the test engine output, where the "Comparison Summary" provides the total count of mismatched data points and the mismatch count by measure. It also depicts the count of data points that are present in only the source or only the target.
The detailed view highlights the mismatched data points with a RED background. The column of interest here is "Table", from which one can identify the source a data row belongs to: in our case, OLTP means transaction data from SAP Crystal Reports and OLAP means dimensional data from Qlik.
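This is not the white paper's actual engine, but a minimal sketch of the comparison idea in Qlik script (connection, file names, key and measure are hypothetical):
// Hypothetical sketch: load the source (OLTP) and target (OLAP) report
// extracts, join them on a common key, and flag rows where the measure differs.
Compare:
LOAD Key, Measure AS Measure_OLTP
FROM [lib://QA/source_extract.csv] (txt, utf8, embedded labels, delimiter is ',');

LEFT JOIN (Compare)
LOAD Key, Measure AS Measure_OLAP
FROM [lib://QA/target_extract.csv] (txt, utf8, embedded labels, delimiter is ',');

Mismatches:
LOAD *, If(Fabs(Measure_OLTP - Measure_OLAP) > 0.0001, 1, 0) AS IsMismatch
RESIDENT Compare;
DROP TABLE Compare;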
Benefits of the BI QA Automation Approach
As per the analyst firm Gartner: "The average organization loses $14.4 million dollars annually through poor data quality."
Conclusion
The automated BI report data quality analysis approach is a foolproof automated solution to the challenges Quality Analysts face when performing data matching in environments like a report migration from one BI tool to another. It supports multiple file formats and is based on the prerequisite that the data is frozen at the same point in time for both reporting tools. The observation is that it reduces the testing effort by around 80 percent, with fewer and less technically skilled resources required. It thus enhances trust in the reports while reducing effort.
Current Version 3.20 (16-Oct-2015)
Now supports partial reload, runs faster with large data sets,
and puts the Num() function only on the outer KPI if KPIs are nested.
Download: Dropbox - KPI_Repository.zip
I recommend naming KPI variables starting with "_". Then you have a nice way to use type-ahead formula completion in Sense and QlikView, as all your KPIs sort into a dropdown list next to the "_".
As a presales person, you have probably done it a hundred times: add lots of KPI formulas while trying to keep a maintainable format with a "define once" principle. Formulas and sets should be separated in their definitions and mixed and reused as needed.
This solution allows you to
It is common knowledge that
But wait a minute! There are two traps, which are definitely NOT common knowledge.
So forget this manual approach and use my include script in one of the ways below: minimalistic flat file, maximum flat file, or database mode.
Any text format is supported; you have to specify the arguments for the text format.
Alternatively, use this in an Excel sheet. Same thing, easier to edit, but Excel is not always available during a SiB
SET vKPI_Source = 'File';
SET vKPI_File =
SET vKPI_File_Params = [txt, codepage is 1252, embedded labels, delimiter is '\t', msq];
SET vKPI_KeepTable = 1;
$(must_include=
Create a folder connection to where the include-script.txt is placed, and a folder connection (if different) to where the KPI definition file is placed. All file formats are supported, but you have to provide the file import parameters below, so it could be txt, biff, xlsx …
SET vKPI_Source = 'File';
SET vKPI_File = [lib://include/KPI_Def.txt]; // Qlik Sense
SET vKPI_File_Params = [txt, codepage is 1252, embedded labels, delimiter is '\t', msq];
SET vKPI_KeepTable = 1;
$(must_include=[Lib://include/Create_KPI_Repository-include.txt]);
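Once the include script has run, every KPI is available as a variable, so a chart expression simply dollar-references it, e.g. with one of the example KPIs defined further below:
=$(_PAX_NR.CY)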
It doesn't matter which database you use. Attached is an MS Access example; it could be any other relational database. You will need 4 tables: AppGroups, Set_Definitions, KPI_Definitions and KPI_Variants.
You know, sequence matters, so it is key to understand when you can already refer to a "previously" defined formula. The sequence when using a database is: AppGroups, then Set_Definitions, then KPI_Definitions, then KPI_Variants.
Examples for each table:
AppGroups:
| | APPGROUP | APP_PATTERNNAME |
|---|---|---|
| 2 | MGMT | *Management* |
| 3 | MGMT | *Dashboard* |
| 4 | TEST | *test* |
| 5 | TEST | *try* |

Set_Definitions:
| | USAGE | NAME | FORMULA | DONTRESOLVE | COMMENT |
|---|---|---|---|---|---|
| 10 | * | CY | Year={$(=Max(Year))} | | |
| 20 | MGMT | LH | CARRIER_ALN_CD_LH_Flag = {1} | | |
| 30 | * | PY | Year={$(=Max(Year)-1)} | | |

KPI_Definitions:
| | USAGE | NAME | FORMULA | DONTRESOLVE | NUMFORMAT | DECIMALSEP | THOUSANDSEP | COMMENT |
|---|---|---|---|---|---|---|---|---|
| 10 | * | _PAX_NR | Sum ({$} PAX_NR) | | 0.000 | , | . | |
| 20 | * | _Destinations | Count(DISTINCT {< $(LH) >} OTHER_ARP_CD) | | | | | |
| 30 | * | _%PAX | Sum(PAX_NR_DETAILS)/ SUM (TOTAL PAX_NR_DETAILS) | | ##0 % | | | |
| 40 | MGMT | _Margin 1 | sum(Margin_ONB) | | | | | |
| 50 | * | _Margin 1 per PAX | $(_Margin 1) / $(_PAX_NR.LH) | 1 | | | | |

KPI_Variants:
| | USAGE | NAME | VARIANTOF | SEARCH | REPLACE | DONTRESOLVE | NUMFORMAT | DECIMALSEP | THOUSANDSEP | COMMENT |
|---|---|---|---|---|---|---|---|---|---|---|
| 10 | * | _PAX_NR.CY | _PAX_NR | {$} | {< $(CY) >} | No | | | | |
| 20 | * | _PAX_NR.PY | _PAX_NR | {$} | {< $(PY) >} | No | | | | |
| 30 | * | _PAX_NR.LH | _PAX_NR | {$} | {< $(LH) >} | No | | | | |
Note that KPI_Variants has no FORMULA column, as the formula is inherited from the KPI defined under VARIANTOF.
//Put your database connector here
OLEDB CONNECT TO [Provider=Microsoft.ACE.OLEDB.12.0;User ID=Admin;Data Source=.............];
SET vKPI_Source = Database;
LET vKPI_SQL_AppGroups = 'AppGroups';
LET vKPI_SQL_Set_Definitions = 'Set_Definitions';
LET vKPI_SQL_KPI_Definitions = 'KPI_Definitions';
LET vKPI_SQL_KPI_Variants = 'KPI_Variants';
$(must_include=
Create a folder connection to where the include script is placed; in the example below, that connection is called "include". Create the database connection to where the definition tables are located. Connect to it, then provide the SQL command in the variable vKPI_Select before calling the include script.
LIB CONNECT TO 'kpi_definitions_db';
SET vKPI_Source = Database;
LET vKPI_SQL_AppGroups = 'AppGroups';
LET vKPI_SQL_Set_Definitions = 'Set_Definitions';
LET vKPI_SQL_KPI_Definitions = 'KPI_Definitions';
LET vKPI_SQL_KPI_Variants = 'KPI_Variants';
$(include=[Lib://include/Create_KPI_Repository-include.txt]);
Provide columns in UPPER CASE. The sequence of the columns doesn't matter, and you don't have to provide all of them; the 3 mandatory columns are enough. Any other combination is allowed.
USAGE (mandatory): work in progress. Note: the script will not "find" apps with a matching title anywhere and inject the KPI definitions there; you need to add the load script yourself in any of the applicable apps. Be aware that a formerly created variable is not removed if a wildcard expression is later changed and no longer applies to a given app; the variable remains with its last loaded state unless removed in the frontend.
NAME (mandatory): the variable name under which the formula is stored. Brackets and single quotes are not allowed in the KPI name; the name is case-sensitive. Recommended characters: A-Z, a-z, 0-9, @, $, _ and space. Use a distinct prefix for sets (like set_ or $) and another prefix for KPI formulas.
FORMULA (mandatory): put the correct formula or other definition here. Although you can assign all types of content to variables here, I see two typical ways of usage: aggregation formulas (aka "KPIs") and set modifiers.
VARIANTOF (optional): work in progress
SEARCH (optional): work in progress
REPLACE (optional): work in progress
DONTRESOLVE (optional): a flag that defines whether a formula with a dollar-bracket $(reference) is resolved (replaced by its referred content) immediately during script execution, or whether the reference (the dollar-bracket fragment) is kept as part of the formula itself.
NUMFORMAT (optional): optionally, put a valid format string here. It causes the above formula to be wrapped into a Num( … ) function, where columns 6 and 7 are used as additional parameters for Num().
DECIMALSEP (optional): if column 5 is used, this may optionally define the decimal separator (one character only). If NUMFORMAT is not used, this column is ignored.
THOUSANDSEP (optional): if columns 5 and 6 are used, this may optionally define the thousands separator (one character only). If DECIMALSEP is not used, this column is ignored; it only works in combination with columns 5 and 6.
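For illustration: given the _PAX_NR example above, with NUMFORMAT 0.000, DECIMALSEP ',' and THOUSANDSEP '.', my reading of these columns is that the stored variable ends up wrapped roughly like this (a sketch, not verified script output):
// _PAX_NR, as wrapped by the include script:
Num(Sum({$} PAX_NR), '0.000', ',', '.')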
If you would like to understand why defining KPIs in variables directly in the script is problematic, then try this in your load script:
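A minimal sketch of the kind of trap meant here, assuming standard SET/LET behavior and hypothetical KPI names:
// SET stores the text literally, so the $(=...) fragment survives into the
// variable and only resolves later, in the chart object - it stays dynamic:
SET _Sales_CY = Sum({<Year={$(=Max(Year))}>} Sales);
// LET (and any dollar expansion executed in the script) resolves immediately
// at reload time, freezing the current Max(Year) into the formula:
LET _Sales_PY = 'Sum({<Year={$(=Max(Year)-1)}>} Sales)';  // becomes e.g. Year={2014}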
So we conclude as a general rule
Hi
We currently have Qlikview May 2021 (SR2) and QDF 1.7.1. Everything is working correctly.
We are now planning to upgrade to the latest version of QlikView (May 2022 SR1), and I'm wondering if we would also need to upgrade QDF?
Regards Mats
Hi,
We received input from the CIS team in our organization to address a cross-site scripting vulnerability in Qlik Sense.
Do we also need to implement a Content Security Policy?
Please find below a snapshot of the virtual proxy's additional response headers:
Please let me know if I need to add any other parameters.
How do I implement a Content Security Policy?
Regards,
Rohit Gharat
Hi,
Does anyone have experience with migrating the Repository DB from self-hosted to GCP Postgres?
Any info or guidelines are appreciated,
Thanks in advance,
br
Paul
Hello
we've been using QDF for a long time now,
but we still couldn't resolve the issue of debugging a script in a QVF file:
if I try to debug a script, the debugger delves into all the script files used by QDF, and this takes ages.
Any idea how to debug an app without having the debugger cursor go through all the script files?
We sporadically see the effect that a user can enter the hub and sees all the correct apps, but whenever he tries to open an app, the Qlik loading bubbles appear and stay, no matter how long he waits.
To make things worse, this bug can't be detected in the QMC, so we only notice its occurrence when users complain that Qlik isn't working.
This has happened on our single-server system as well as on our 4-server system with one central node, two reload nodes and one display node.
The only way to remove the bug is to reboot the display-node server or, on the single-server system, the whole system.
We are currently trying to find an HTML query to detect when the error occurs, so we can initiate an automatic server reboot. While this would help us reduce downtime for the users, it still doesn't solve the actual problem.
Any tips or ideas would be highly appreciated.
Hi All,
I have a general deployment question for the experts. We are currently looking into a multi-cloud setup for a customer who wants to use the Qlik SaaS environment for its data-viz app-building capabilities as well as for the data applications to be used by viewers/consumers.
He would like to keep his data flow on his QSEoW environment (data loading from different systems and storing into QVDs) and store his data marts in the cloud. As the next step, he would set up a binary load from an app that he stores in a shared space in the Qlik SaaS environment and make the necessary front-end changes in Qlik SaaS. He would then publish the app to a managed space so that the end users can consume the changes.
We investigated the usage of Qlik DataTransfer, but it is kind of unstable in my opinion, and when it's stuck you don't see it in any data flow or task process in the QSEoW QMC, so for administration purposes this is a no-go for a production deployment framework (in this case).
We went for the approach of using Qlik CLI for SaaS in combination with Qlik CLI for Windows and implemented a trigger in the data mart scripts that launches a PowerShell file to deploy an app to SaaS (export from QSEoW and import into Qlik SaaS). BUT 🙂 the qlik app import functionality no longer supports AUTOREPLACE mode, so every time we import an app, a new app ID is created, which breaks the binary load in the SaaS environment because it is set up to load from a specific app ID.
I'm kind of stuck in my setup on how to do this for a customer that has a multi-cloud license/setup and wants the approach above.
How would you guys approach this request/deployment setup?
Thanks in advance,
Timmy
hello
I got a container in which I have QVS files in both the Config and Include folders
I see that I can't call the subroutine in the QVS file in the Config folder unless I include the QVS using $(Must_Include...
is this the only way, or am I missing something?
I mean, if the folder is meant to be a configuration folder, why should I include it in the script instead of just calling the subroutine?
kindly advise
Hi,
We were using custom folder names for QVDs in QlikView (Raw, Transformed, Custom) and we want to move to the QDF deploy tool for Qlik Sense, but it seems that whenever you deploy a new container, the tool copies an init.qvs to all containers, with a version most probably embedded in the exe. I tried changing the inline table in the init.qvs from the template container, but the tool copies another version anyway.
Any ideas if this copy can be skipped, or changed to copy from the template container?
Thanks,
Bogdan
hello guys, what's up,
I need help with how I can compile a new "QlikDeploymentFramework.exe", because I made a change to the standard QVS "12.Index.qvs", and every time I create a container, "QlikDeploymentFramework.exe" undoes my changes in all existing containers.
This also happens with my ".gitignore" file: because I don't enable the "Installs Qvc Lib" option, the ".gitignore" is also overwritten by what is compiled inside my "QlikDeploymentFramework.exe" version 1.7.4, and this messes up my whole environment.
So I would like to make changes to these two files and compile a new "QlikDeploymentFramework.exe". How is that possible?
Thank you in advance.
Hello, we all know that we can use the LCGV function to connect to a container and generate variables for the path of the desired folder...
Now, we've been using QDF at my company for almost 2 years, and the thing is that we cannot tell which container connects to which container.
Is there a way to visualize, or to generate, some kind of dependency list between containers?
Things are becoming cumbersome, like a spider web if I may say so.
Kindly advise.
@Michael_Tarallo
@Magnus_Berg
@Anonymous
I have two QlikView Publisher servers with a production and a development QDF Framework Root. I was hoping it is possible to create a Qlik Sense node that has apps and a single lib mount to the development QDF Root, and a separate Qlik Sense server that has a single lib mount to the production QDF Root. Is it possible to segregate nodes for reloading apps?