I am creating an OAuth client in the Management Console per these instructions: https://qlik.dev/authenticate/oauth/create/create-oauth-client/.
However, when I change the consent method to "trusted" and publish the client, it flips back to "required". Any idea why?
I’m trying to build an automation in Qlik that sends an Excel file generated from a tabular report in an app that uses Section Access. The goal is for each user (around 100) defined in Section Access to receive an email with their individual data only (based on their access level).
I know that in-app reporting can handle personalized reports, and I’ve looked into using the Create Report, Add Sheet to Report, and Generate Report blocks in Qlik Application Automation. However, the Generate Report block only supports PDF and PPT, and I specifically need Excel output.
Additionally, I need to include CC recipients in the email for each user.
My questions:
Can the automation loop through Section Access users and apply each user's data reduction to the attached file?
How can I dynamically add CC recipients for each email?
Any guidance or best practices would be greatly appreciated!
Hi all,
I'm having issues with ConvertToLocalTime and the UK's DST change.
For clarity, the change happened on 26th October 2025.
Our setup is Qlik Sense (Cloud) pulling from Dynamics 365 Business Central via OData v4. I am trying to build some of these queries as incremental loads (dump, save to disk, and on the next run pull anything >= the last CreatedAt time).
I'm running this code, which isn't overly complex, to build an Exclusion List.
LIB CONNECT TO 'BC Testing - Odata:BC Odata Connection';
RestConnectorMasterTable:
SQL SELECT
"__KEY_root",
(SELECT
"Document_No",
"SystemCreatedAt",
"__FK_value"
FROM "value" FK "__FK_value")
FROM JSON (wrap on) "root" PK "__KEY_root"
WITH CONNECTION(
URL "$(vBaseURL)/$(vTable)?$filter=ManualDeleteReason eq 'Incorrect Order'",
HTTPHEADER "Authorization" "Bearer $(vAccessToken)");
_ExclusionList:
LOAD
distinct [Document_No] as ExcludedOrders,
SystemCreatedAt as SysCreatedOrig,
date(Timestamp(
ConvertToLocalTime(
Timestamp#([SystemCreatedAt], 'YYYY-MM-DDTHH:mm:ss.fffZ'),
'Europe/London')),'DD/MM/YYYY hh:mm') as SystemCreatedAt
RESIDENT RestConnectorMasterTable
WHERE NOT IsNull([__FK_value])
order by SystemCreatedAt asc
;
Anything pre-DST is converted; anything post-DST isn't.
I tried using Alt(), but that made no difference.
LIB CONNECT TO 'BC Testing - Odata:BC Odata Connection';
RestConnectorMasterTable:
SQL SELECT
"__KEY_root",
(SELECT
"Document_No",
"SystemCreatedAt",
"__FK_value"
FROM "value" FK "__FK_value")
FROM JSON (wrap on) "root" PK "__KEY_root"
WITH CONNECTION(
URL "$(vBaseURL)/$(vTable)?$filter=ManualDeleteReason eq 'Incorrect Order'",
HTTPHEADER "Authorization" "Bearer $(vAccessToken)");
_ExclusionList:
LOAD
distinct [Document_No] as ExcludedOrders,
SystemCreatedAt as SysCreatedOrig,
Date(
ConvertToLocalTime(
Alt(
Timestamp#([SystemCreatedAt], 'YYYY-MM-DDThh:mm:ss[.fff]Z'),
Timestamp#([SystemCreatedAt], 'YYYY-MM-DDThh:mm:ss[.fff]+00:00')
),
'Europe/London'
),
'DD/MM/YYYY hh:mm'
) as SystemCreatedAt
RESIDENT RestConnectorMasterTable
WHERE NOT IsNull([__FK_value])
ORDER BY SystemCreatedAt asc;
drop table RestConnectorMasterTable;
exit script;
If I use the same functions on a live table rather than the archive (so the table only contains post-DST records), the function works correctly.
Can anyone shed some insight? I can't afford to dump the whole archive over the API each time.
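One way to narrow this down: if Timestamp# fails to parse a raw value (for example, because the fractional-second part changed shape in the post-DST records), ConvertToLocalTime receives null and the value passes through unconverted, which matches the symptom. A minimal diagnostic sketch, assuming the RestConnectorMasterTable load from the script above; the two pattern strings are guesses to test against your data, not confirmed fixes:

```qlik
// Diagnostic sketch: flag which raw SystemCreatedAt values each parse
// pattern rejects. Run after the RestConnectorMasterTable load above.
_ParseCheck:
LOAD
    [SystemCreatedAt] as RawValue,
    If(IsNull(Timestamp#([SystemCreatedAt], 'YYYY-MM-DDThh:mm:ss.fffZ')), 1, 0) as FailsFractionalPattern,
    If(IsNull(Timestamp#([SystemCreatedAt], 'YYYY-MM-DDThh:mm:ssZ')), 1, 0) as FailsWholeSecondPattern
RESIDENT RestConnectorMasterTable
WHERE NOT IsNull([__FK_value]);
```

If the failing rows line up with the post-DST dates, compare their raw strings against the pre-DST ones; the fix would then be an Alt() chain whose patterns cover both shapes, since ConvertToLocalTime with a place name should handle the DST switch itself.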
I'm experiencing a weird situation in Qlik Sense where, when I select a condition in my table,
=YearEnd>=NetDueDate
Qlik Sense always returns both positive and negative cases (video attached).
The data is not getting filtered.
To give you a background,
I'm using an As-Of table (Report_Calendar) to calculate a rolling sum of Amounts.
To calculate overdues, I'm comparing:
1. The YearEnd field from the As-Of table with NetDueDate in my fact table (AccountsReceivable).
2. The Clearing Date from my fact table with the YearEnd field in the As-Of table.
These conditions when used in set analysis were not returning expected results.
So I tried running the conditions individually and noticed this weird behaviour.
My data model looks like below,
Any help would be appreciated.
TIA,
Hi,
My target is an AWS S3 bucket.
I want to limit the maximum size of the files written to the target to 3 MB, but I don't know where to do this configuration.
Which of these options is suitable and what are their differences?
-In manage endpoint connection -> File attributes -> Maximum file size (KB)
-In manage endpoint connection -> Change processing -> Apply/store changes when: File size reaches(KB)
-Task Settings -> Change Processing Tuning -> Transaction Offload Tuning -> Offload transactions in progress to disk if -> Total transactions memory size exceeds (MB)
thanks,
Hello, I am having an issue right now where when I try to reorder the public sheets within my app, I am getting the following:
The app was originally created by a former employee. I have tried multiple things including making myself the app owner, duplicating the app to ensure it is mine, and some other things I've forgotten at this point. I am also the owner of the space that the app resides in.
Any help would be greatly appreciated. Thanks!
We are in the process of migrating Talend Studio projects from version 7.3 to 8.
The 7.3 jobs were deployed as Docker images in ECR. For some jobs, there are multiple versions in Git, and we are unable to identify which version was actually deployed.
We have pulled the Docker images as .tar files and attempted to import them into Talend Studio 8, but we encounter the following error:
“No valid items to import.”
From our investigation, it appears that these .tar files are Docker images containing compiled job runtimes, not Talend job export files.
Could you please advise on the best approach to extract or obtain the correct Talend job items from these Docker images so that they can be imported into our Talend Studio 8 for review and migration?
The documentation is not very clear on this specific request.
Creating QVS files
Create QVS files in a text editor outside of Qlik Cloud. Load script history can also be downloaded as QVS files.
Do the following:
Create a block of load script in an app or script.
Creating your load script in an app or script allows you to test the script first before adding it to a QVS file.
Alternatively, find a block of load script you want to reuse.
In a text editor, copy and paste the block of load script you want to reuse.
Save the load script as a file with the QVS extension.
You can now upload the QVS file to Qlik Cloud or a web storage provider.
Example:
The app will consume the needed data daily in the proper script format, then store it to a data connection.
For example:
Comment Field [Field1] With "Dimension | A note that describes field contents ";
Store QVS_1 into [lib://QVS_Files/Automation/Generate_Data.qvs] (txt, delimiter is ';');
Then all apps that need this script load it daily via a Must_Include statement.
What am I missing here?
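For reference, the usual round trip looks like the sketch below. The library path and field name are taken from the example above, so adjust them to your environment; note that Store writes a table's rows, so for an app or automation to generate the QVS, the stored table's rows must themselves contain the script lines.

```qlik
// Generate_Data.qvs -- plain-text load script saved with a .qvs extension
// and uploaded to the QVS_Files data connection:
Comment Field [Field1] With 'Dimension | A note that describes field contents';

// In each consuming app's load script, pull the file in at reload time:
$(Must_Include=lib://QVS_Files/Automation/Generate_Data.qvs);
```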