Join Qlik Support's Office Hours, recorded live, where our experts answer your questions and offer more insight. This month's topic: New to Qlik Cloud.
Hi everyone,
Using the code below:
WeekName([Calendar date], 0, -4)
I got the last week of 2025 (25 December to 31 December 2025) labelled as the first week of 2026 instead of the last week of 2025.
I tried the code below, but it didn't work:
If(date >= Date#('25/12/2025', 'DD/MM/YYYY') and date <= Date#('31/12/2025', 'DD/MM/YYYY'), 53,
If(date >= Date#('01/01/2026', 'DD/MM/YYYY') and date <= Date#('07/01/2026', 'DD/MM/YYYY'), 1,
Week(date, 0, -4) - 1))
Is there another way to solve this? Thanks in advance.
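Week numbering of this kind usually follows the ISO 8601 rule, under which a week belongs to the year that contains its Thursday, so the final days of December can fall into week 1 of the next year. A quick standard-library Python check (not Qlik) illustrates the rule:

```python
from datetime import date

# ISO 8601 assigns each week to the year containing that week's Thursday,
# so the last days of December can belong to week 1 of the following year.
for d in [date(2025, 12, 25), date(2025, 12, 28),
          date(2025, 12, 29), date(2025, 12, 31)]:
    iso_year, iso_week, _ = d.isocalendar()
    print(d, "-> ISO year", iso_year, "week", iso_week)
```

Under this rule, 25–28 December 2025 are still week 52 of 2025, while 29–31 December already belong to week 1 of 2026.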
Hello,
I'm using Talend Cloud Data Management Platform
Version: 8
Build id: 20250822_0619-patch
R2025-08
I am exporting a Talend job as a standalone job using the “Build Job” function in Talend Studio.
This process generates a ZIP archive that contains all required artifacts to run the job independently, including a shell script and context files.
During the export, it is possible to select a context. In our case, we use two contexts: PROD and DEV.
I exported the job with the PROD context selected. Inside the generated ZIP archive, a file named PROD.properties is present.
When I run the standalone job on a Linux (Ubuntu) virtual machine, the job does not use the configuration values from PROD.properties.
Instead, it continues to use the context values that are defined directly within the Talend job itself.
Running the job without parameters:
Running the job while explicitly specifying the context and the properties file:
In both cases, the job does not apply the values from PROD.properties.
When I override individual context variables directly via the command line using --context_param, the values are applied correctly. For example:
/home/administrator/talend_jobs/prod/demo/CassandraPomJob_0.1_PROD/CassandraPomJob/CassandraPomJob_run.sh \
Why is the standalone job not using the values from the PROD.properties file, even though the file exists in the exported ZIP and is explicitly referenced via --context_param_file?
What is the correct or recommended way to run a Talend standalone job so that it uses the values defined in a context .properties file?
Thanks a lot in advance.
Best regards,
Tugce
Hi,
I am trying to load a context variable value from file data, but I am getting a null value. I don't want to load the file data via any Talend components. Please check and suggest.
File Name : test.properties
File Data:
kafkaUser=testUser
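For reference, Talend context files follow the Java .properties format. A minimal Python reader (a rough approximation of java.util.Properties that ignores escapes and line continuations) can confirm what a job would see in the file:

```python
def load_properties(path):
    """Rudimentary .properties reader: skip blanks/comments, split on the first '='."""
    props = {}
    with open(path, encoding="utf-8") as f:
        for raw in f:
            line = raw.strip()
            if not line or line.startswith(("#", "!")):
                continue
            key, sep, value = line.partition("=")
            if sep:
                props[key.strip()] = value.strip()
    return props

# e.g. load_properties("test.properties") -> {'kafkaUser': 'testUser'}
```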
I need to create a security rule so that users can access only the data connections they own, using QmcSection_DataConnection.
I created a custom admin role "DataConnection" and assigned it to the user. In the rule condition I used ((user.roles="DataConnection" and resource.owner.name=user.name)), but the user is still able to see all data connections.
Hi All,
I am quite new to Qlik. I want to create four fields (SUM_1, SUM_2, SUM_3, SUM_4) that count distinct IDs meeting certain criteria, by category. How can I create these four expressions on the front end?
Criteria:
| Field | Criteria |
| SUM_1 | Criteria1 = 1; Criteria2 = 2 |
| SUM_2 | Criteria1 = 2; Criteria2 = 1 |
| SUM_3 | Criteria1 = 1; Criteria2 = 1 |
| SUM_4 | Criteria1 = 2; Criteria2 = 2 |
Existing table:
| ID | Category | Criteria1 | Criteria2 |
| 432175 | CateA | 1 | 2 |
| 135354 | CateA | 1 | 1 |
| 647587 | CateB | 1 | 2 |
| 154324 | CateB | 1 | 2 |
| 867599 | CateA | 1 | 1 |
| 245654 | CateA | 1 | 1 |
| 589799 | CateB | 2 | 1 |
| 546762 | CateA | 2 | 1 |
Target Table:
| Category | SUM_1 | SUM_2 | SUM_3 | SUM_4 |
| CateA | 1 | 1 | 3 | 0 |
| CateB | 2 | 1 | 0 | 0 |
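On the front end this kind of count is usually written with set analysis, e.g. something like Count({<Criteria1={1}, Criteria2={2}>} distinct ID) for SUM_1, with Category as the dimension. As a sanity check on the expected numbers, the same counting logic in plain Python (sample data copied from the table above):

```python
from collections import defaultdict

rows = [  # (ID, Category, Criteria1, Criteria2) from the existing table
    (432175, "CateA", 1, 2), (135354, "CateA", 1, 1),
    (647587, "CateB", 1, 2), (154324, "CateB", 1, 2),
    (867599, "CateA", 1, 1), (245654, "CateA", 1, 1),
    (589799, "CateB", 2, 1), (546762, "CateA", 2, 1),
]
# Each (Criteria1, Criteria2) pair maps to one of the four target fields.
buckets = {(1, 2): "SUM_1", (2, 1): "SUM_2", (1, 1): "SUM_3", (2, 2): "SUM_4"}

distinct = defaultdict(lambda: {f: set() for f in buckets.values()})
for id_, category, c1, c2 in rows:
    distinct[category][buckets[(c1, c2)]].add(id_)  # sets keep IDs distinct

result = {cat: {f: len(ids) for f, ids in fields.items()}
          for cat, fields in distinct.items()}
print(result)
```

The output matches the target table: CateA gives 1, 1, 3, 0 and CateB gives 2, 1, 0, 0.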
Hi,
Talend Studio
Version: 8
Build id: 20250218_0945-patch
Sometimes developers close the merge results dialog by accident.
Is it possible to re-open the merge results dialog without re-doing the merge?
Thanks
Hello,
Can anyone please help me filter the data set below into two outputs using Qlik?
Sample Data:
Date ID Amount
1/1/2025 AAA 2
2/1/2025 AAA 5
12/1/2025 AAA 2
12/1/2025 AAA 5
12/1/2025 AAA 2
23/1/2025 AAA 2
24/1/2025 AAA 5
Output1
Date ID Amount
1/1/2025 AAA 2
12/1/2025 AAA 2
12/1/2025 AAA 2
23/1/2025 AAA 2
Output2
Date ID Amount
2/1/2025 AAA 5
12/1/2025 AAA 5
24/1/2025 AAA 5
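Assuming the split is simply by Amount value, in the Qlik load script this would typically be two loads with Where Amount = 2 and Where Amount = 5 respectively. The intended split, sketched in Python for clarity:

```python
rows = [  # (Date, ID, Amount) from the sample data
    ("1/1/2025", "AAA", 2), ("2/1/2025", "AAA", 5),
    ("12/1/2025", "AAA", 2), ("12/1/2025", "AAA", 5),
    ("12/1/2025", "AAA", 2), ("23/1/2025", "AAA", 2),
    ("24/1/2025", "AAA", 5),
]
output1 = [r for r in rows if r[2] == 2]  # Amount = 2 rows
output2 = [r for r in rows if r[2] == 5]  # Amount = 5 rows
```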
Thanks in Advance
I have a SQL Server source table that has 5 columns:
The MERGE statement looks like this (the names were changed to protect the innocent):
MERGE INTO "dbo"."my_table_name" T
USING (
SELECT *
FROM "public"."attrep_changes6004D92ED521268C"
WHERE "table_id" = 1
) S
ON (T."ASSIGNMENTID" = IFF("table_id" = 1, TRY_CAST(S."seg1" AS NUMBER(38, 0)), NULL))
AND (T."PERIODID" = IFF("table_id" = 1, TRY_CAST(S."seg2" AS NUMBER(38, 0)), NULL))
WHEN MATCHED AND "replicate_op" = 0 AND "table_id" = 1
THEN DELETE
WHEN MATCHED AND "replicate_op" <> 0 AND "table_id" = 1
THEN UPDATE SET
T."NumericColumn1" = IFF("table_id" = 1, TRY_CAST(S."col1" AS NUMBER(38, 0)), NULL),
T."NumericColumn2" = IFF("table_id" = 1, TRY_CAST(S."col2" AS NUMBER(38, 0)), NULL),
T."NumericColumn3" = IFF("table_id" = 1, TRY_CAST(S."col3" AS NUMBER(38, 0)), NULL),
T."DatabaseInsertDate" = IFF("table_id" = 1, TRY_CAST(S."col4" AS TIMESTAMP(3)), NULL),
T."DatabaseUpdateDate" = IFF("table_id" = 1, TRY_CAST(S."col5" AS TIMESTAMP(3)), NULL)
WHEN NOT MATCHED AND "replicate_op" <> 0 AND "table_id" = 1
THEN INSERT (
"NumericColumn1",
"NumericColumn2",
"NumericColumn3",
"DatabaseInsertDate",
"DatabaseUpdateDate"
)
VALUES (
IFF("table_id" = 1, TRY_CAST(S."col1" AS NUMBER(38, 0)), NULL),
IFF("table_id" = 1, TRY_CAST(S."col2" AS NUMBER(38, 0)), NULL),
IFF("table_id" = 1, TRY_CAST(S."col3" AS NUMBER(38, 0)), NULL),
IFF("table_id" = 1, TRY_CAST(S."col4" AS TIMESTAMP(3)), NULL),
IFF("table_id" = 1, TRY_CAST(S."col5" AS TIMESTAMP(3)), NULL)
);
I figured out that the issue is caused by TRY_CAST returning a NULL for my DatabaseInsertDate. The full load works fine, but from what I can see the full load does a COPY INTO and does not run a TRY_CAST. I have tried adding a transformation in the Table Settings for this table:
strftime('%Y-%m-%d %H:%M:%f', $DatabaseInsertDate)
...but that has not helped.
The public.attrep_changes6004D92ED521268C table goes away as soon as the error occurs so I cannot see the data behind the issue.
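For context, Snowflake's TRY_CAST returns NULL rather than raising an error when a cast fails, which matches the NULL landing in DatabaseInsertDate. A Python analogue (the format strings are illustrative assumptions, not the actual source format) shows how a format mismatch silently becomes None/NULL:

```python
from datetime import datetime

def try_cast_timestamp(value, fmt="%Y-%m-%d %H:%M:%S.%f"):
    """Mimic TRY_CAST semantics: return None (NULL) instead of raising on failure."""
    try:
        return datetime.strptime(value, fmt)
    except (TypeError, ValueError):
        return None

print(try_cast_timestamp("2025-01-31 10:15:00.123"))  # parses fine
print(try_cast_timestamp("31/01/2025 10:15:00"))      # format mismatch -> None
```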
Please help. Thanks
Hello all
I have a button in my Qlik Cloud app that executes an automation in triggered mode. When I publish the app to a managed space so that end users can trigger it themselves, they click the button, but the automation is not executed.
It seems like only the owner of the automation is allowed to execute it.
I did everything in this video: https://youtu.be/uwtpmjumejc?si=9kLwfiOoab4Ywhr0 in both manual mode and triggered mode, but still nothing.
How can I resolve this? Also, is this a limitation of the Analyzer license?
Thanks
Important update: the automation trigger works inside the tenant but not in the embedded app. I forgot to mention that I'm embedding the app inside an iframe in my web app, and I think the limitation comes from this.