Hello,
I have a dimension called "Exercice période" (French for "exercise period"). I want to systematically select the previous month of the period: for example, if we are in May, I want April to be selected.
Is there an expression for that, please?
Thank you so much
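For what it's worth, a minimal sketch, assuming the field holds month values formatted like 'MMM YYYY' (adjust the format to match your actual data, and note that "Sales" below is just a placeholder field): derive the previous month from today's date, then use it in a variable or in set analysis.

```
// Hypothetical sketch: compute the previous month relative to today
LET vPrevPeriod = Date(AddMonths(Today(), -1), 'MMM YYYY');

// Example measure using set analysis to restrict to the previous month:
// Sum({<[Exercice période] = {"$(=Date(AddMonths(Today(), -1), 'MMM YYYY'))"}>} Sales)
```

The same $(=...) dollar-sign expansion can also be used as a default-selection expression so the previous month is selected automatically.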
Dear colleagues,
We currently use Qlik Sense Enterprise on Windows, May 2023 release.
If we want to upgrade to February 2024, what are the correct steps, and is there a manual I can follow?
Do we need a backup before the upgrade?
We also want to be sure the license can still be used after the upgrade.
It would be great to get your support ASAP.
😊
Jens.
I am looking to remove either the following buttons entirely, or the ability of Analyzer licenses to access them in an app. If the buttons cannot be removed or hidden in the app, I would like to restrict Analyzer licenses from using them.
- "Prepare Reporting" button in header/selection bar
- "Narrate Storytelling" button in header/selection bar
- Selections Tool in header/selection bar
I have Qlik Sense Enterprise SaaS. I would like the changes to be applied to the app itself, so that any analyzer license that accesses the app cannot use the buttons/functions.
Dear Qlik NPrinting experts,
My goal with this post is to know how to get the user id created by the Qlik NPrinting POST Users REST API method inside a Qlik Sense load script.
Environment:
Qlik Sense: May 2023 SR4
Qlik NPrinting: May 2023 SR1
I followed the instructions documented in the post "How to use Qlik NPrinting APIs inside a Qlik Sense load script" created by @Gianluca_Perin, and creating a user from a Qlik Sense load script via the Qlik NPrinting REST APIs worked perfectly. Although my new user is created as expected, I need to understand why the user id is not returned to my Qlik Sense script.
I followed the documentation above, and the table remains empty after creating the user.
Here is my Qlik Sense script and the data connection configuration is also attached:
/*** LOGIN ***/
LIB CONNECT TO 'REST_NPrinting_GET';
//Perform a GET call to NPrinting NTLM login API
RestConnectorMasterTable:
SQL SELECT
"Set-Cookie",
"__KEY__response_header"
FROM JSON "_response_header" PK "__KEY__response_header";
[_response_header]:
LOAD
[Set-Cookie] AS [Set-Cookie]
RESIDENT RestConnectorMasterTable
WHERE NOT IsNull([__KEY__response_header]);
//Extracts session cookie from the API response
// let vCookieRaw = Peek('Set-Cookie',0,'_response_header');
// let vCookie = TextBetween('$(vCookieRaw)','Secure,','Path=/',2);
let vCookieRaw = Peek('Set-Cookie',0,'_response_header');
let vCookie = TextBetween('$(vCookieRaw)', 'SameSite=None,', 'Path=/', 3);
DROP TABLE RestConnectorMasterTable;
//----------------------------------------------------------------
/*** CREATE USER ***/
LIB CONNECT TO 'REST_NPrinting_POST';
set vBody = '{"Username":"UserA","Email":"usera@qlik.com","Password":"123","Enabled":"true","Folder":"","Subfolder":"","DomainAccount":"","Timezone":"Europe/Berlin","Locale":"en"}';
let vBody = replace(vBody,'"', chr(34)&chr(34));
set vPostUserULR = 'https://mynprintingserver:4993/api/v1/users';
set vQSServer = 'https://myqliksenseserver';
RestNPPOSTandPUTTestTable:
SQL SELECT
"__KEY_data"
FROM JSON (wrap off) "data" PK "__KEY_data"
WITH CONNECTION( URL "$(vPostUserULR)", BODY "$(vBody)",
HTTPHEADER "Origin" "$(vQSServer)",
HTTPHEADER "Content-Type" "application/json",
HTTPHEADER "cookie" "$(vCookie)");
// [post_and_put_items]:
// LOAD [__KEY_data] AS [__KEY_data]
// RESIDENT RestNPPOSTandPUTTestTable
// WHERE NOT IsNull([__KEY_data]);
// DROP TABLE RestNPPOSTandPUTTestTable;
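For what it's worth, one direction to investigate is that the script above only selects the connector's internal key column, not the response body. A hedged sketch, assuming the POST response contains the new user's id in a "data" object (the "id" and "data" names here are assumptions about the response shape, to be checked against the actual JSON returned):

```
// Hypothetical sketch: also select the "id" field from the POST response body
[post_response]:
SQL SELECT
    "id"
FROM JSON (wrap off) "data"
WITH CONNECTION( URL "$(vPostUserULR)", BODY "$(vBody)",
    HTTPHEADER "Origin" "$(vQSServer)",
    HTTPHEADER "Content-Type" "application/json",
    HTTPHEADER "cookie" "$(vCookie)");

// Pull the id into a variable for later use in the script
LET vNewUserId = Peek('id', 0, 'post_response');
TRACE New NPrinting user id: $(vNewUserId);
```

If the field list of the response is unclear, selecting the full response in the connector's select dialog is a way to discover which columns are actually available.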
It would be really great if anyone could shed some light on this issue.
Thanks in advance.
Best regards,
Huberto Pereira Haidemann
Hi,
I need to parse a pipe-delimited file with the structure below and load it into a DB using Talend.
Col1|Col2|Col3
"abc|111"|100|"zzz"
"xyz|222"|200|"yyy"
I am using a dynamic schema, as there are multiple files with different schemas (but all files follow the rules above).
Issue: Talend is not able to parse the content inside double quotes (e.g. abc|111) as a single field.
Please help.
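In case it helps, here is a minimal sketch in Java (the language behind Talend routines) of the parsing logic: split a line on pipes while treating double-quoted sections as a single field. It illustrates the logic only and is not a Talend component configuration; something like this could be called from a custom routine or a tJavaRow.

```java
import java.util.ArrayList;
import java.util.List;

public class PipeParser {
    // Split a pipe-delimited line, keeping pipes inside double quotes as field content
    public static List<String> parse(String line) {
        List<String> fields = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        boolean inQuotes = false;
        for (char c : line.toCharArray()) {
            if (c == '"') {
                inQuotes = !inQuotes;          // toggle quoted state, drop the quote itself
            } else if (c == '|' && !inQuotes) {
                fields.add(current.toString()); // unquoted pipe ends the field
                current.setLength(0);
            } else {
                current.append(c);
            }
        }
        fields.add(current.toString());         // last field
        return fields;
    }

    public static void main(String[] args) {
        System.out.println(parse("\"abc|111\"|100|\"zzz\"")); // prints [abc|111, 100, zzz]
    }
}
```

Depending on the component used, it may also be worth checking whether its CSV options (text enclosure set to the double-quote character) achieve the same result without custom code.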
Thanks
Hi,
I need to create a key from 5 common fields, but I need to keep those fields in both tables when loading.
Saleskit:
LOAD
"Year",
"Sales Country",
"Sales Manager",
"Sales rep",
"Sales Role",
"LATAM Lara BP code",
"Master Customer Name",
"Customer name",
Direction,
"Vol. Budget (Teus)",
Segmentation,
Category,
"Month",
"Week"
FROM [lib://External source files/saleskit.qvd]
(qvd);
Database:
LOAD
Service,
Voyage,
Vessel,
Booking,
BL,
"Year",
"Month",
"Week",
"LATAM Lara BP code",
Direction
FROM [lib://External source files/export.qvd]
(qvd);
The key needs to be built from the fields ("Year" & "Month" & "Week" & "LATAM Lara BP code" & Direction).
The problem is that if I keep all 5 fields in both tables, Qlik automatically creates a synthetic key, which I cannot use.
And I cannot drop the 5 fields from either table, because I need a full outer join.
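One possible pattern, given the full outer join requirement, is to join the two loads explicitly on the 5 shared fields; in Qlik script, Join performs a full outer join by default, the tables merge into one, and no synthetic key is created. A sketch (the "..." stands for the remaining fields of each load above):

```
// Sketch: explicit join on the 5 shared fields instead of a synthetic key
Saleskit:
LOAD
    "Year", "Month", "Week", "LATAM Lara BP code", Direction,
    ... // remaining Saleskit fields
FROM [lib://External source files/saleskit.qvd] (qvd);

Join (Saleskit)
LOAD
    "Year", "Month", "Week", "LATAM Lara BP code", Direction,
    ... // remaining Database fields
FROM [lib://External source files/export.qvd] (qvd);
```

If keeping two separate tables is preferred instead, the alternative is to build one concatenated key field (e.g. "Year" & '|' & "Month" & '|' & ... AS %Key) in both loads and keep the 5 individual fields in only one of them.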
Thanks, Karine.
Hello everybody!
I'm setting up an integration with Json Web Token (JWT) and I want to create sessions longer than an hour, is this possible?
The payload I am using is the following:
const payload = {
jti: uid.sync(32),
sub: `${userName}`, // User obtained from the database
subType: "user",
name: `${userName}`,
email: `${email}`,
email_verified: true,
iat: Math.floor(Date.now() / 1000),
exp: Math.floor(Date.now() / 1000) + 60 * 60, //Here I want to increase the timeout to more than an hour
nbf: Math.floor(Date.now() / 1000),
iss: "myISS",
aud: "qlik.api/login/jwt-session",
groups: [
"Analytics Admin",
"Data Admin",
"Data Space Creator",
"Developer",
"Managed Space Creator",
"Shared Space Creator",
"Tenant Admin",
],
};
Qlik Compose for Data Warehouses
Hi All,
I ran the command below from the bin folder of the Qlik Compose server, which is installed on the F drive.
My output folder is on the D drive.
ComposeCli.exe export_csv --project WH_Master --outfolder D:\WHPUAT
This command exported only the CSV files related to the Model, Mappings, and Data Marts.
It did not export any files related to Data Warehouse tasks or SQL files, and it also showed an error:
SYS-E-HTTPFAIL This given path's format is not supported.
Could you please help me with this error?
Thanks!
Hello Experts,
Our customer upgraded from v2022.05 to v2023.11 and encountered the problem below:
In 2022.05, when the following warning occurred, the CDC task continued without any errors.
[TARGET_LOAD ]I: Load finished for table 'ADM'.'XXXXX' (Id = 1). 85627944 rows received. 0 rows skipped. Volume transferred 98611582288. (streamcomponent.c:3976)
[TARGET_LOAD ]I: TPT statistics for table 'REP.YYYYY': Operator=LOAD, CPU time=1115.296875, Received Rows=85627944, Sent Rows=85627944, Applied Rows=85627917 (teradata_tpt_engine.cpp:778)
[TARGET_LOAD ]W: Not all rows have been loaded successfully for 'REP'.'YYYYY' due to possible 'TPT LOAD' errors. Consult Target 'xxxx_E?' / 'xxxx_L?' Error / Log tables contents. (teradata_tpt_engine.cpp:786)
[TARGET_LOAD ]I: teradata_terminate_tpt(...) successfully applied conn->Terminate() method for task identified by UUID '74a9fd96-8335-5944-bb67-6aaa83fb3782' (teradata_tpt_engine.cpp:839)
[TASK_MANAGER ]I: Loading finished for table 'ADM'.'XXXXX' (Id = 1) by subtask 1. 85627944 records transferred. (replicationtask.c:3012)
[TARGET_LOAD ]I: teradata_stop_imp(...) stopping task identified by UUID '74a9fd96-8335-5944-bb67-6aaa83fb3782' (teradata_endpoint_imp.c:857)
[TARGET_LOAD ]I: teradata_free(...) freeing resources for task identified by UUID '74a9fd96-8335-5944-bb67-6aaa83fb3782' (teradata_endpoint_imp.c:878)
[TARGET_LOAD ]I: teradata_disconnect(...) disconnecting for task identified by UUID '74a9fd96-8335-5944-bb67-6aaa83fb3782' (teradata_endpoint_imp.c:92)
[TARGET_LOAD ]I: teradata_disconnect(...) disconnecting Data Connection succeeded for task identified by '74a9fd96-8335-5944-bb67-6aaa83fb3782' (teradata_endpoint_imp.c:104)
[TARGET_LOAD ]I: teradata_disconnect(...) No active Lookup Connection to disconnect from was detected for task identified by '74a9fd96-8335-5944-bb67-6aaa83fb3782' (teradata_endpoint_imp.c:122)
[TASK_MANAGER ]I: Subtask #1 ended (replicationtask_util.c:591)
However, the behavior changed after upgrading to v2023.11.
After the same warning occurs, it is detected as an error and the table is suspended.
[TARGET_LOAD ]I: Load finished for table 'ADM'.'XXXXX' (Id = 1). 85986652 rows received. 0 rows skipped. Volume transferred 99014228864. (streamcomponent.c:4076)
[TARGET_LOAD ]I: TPT statistics for table 'REP.YYYYY': Operator=LOAD, CPU time=1126.937500, Received Rows=85986652, Sent Rows=85986652, Applied Rows=85986625 (teradata_tpt_engine.cpp:781)
[TARGET_LOAD ]W: Not all rows have been loaded successfully for 'REP'.'YYYYY' due to possible 'TPT LOAD' errors. Consult Target 'xxxx_E?' / 'xxxx_L?' Error / Log tables contents. (teradata_tpt_engine.cpp:789)
[TARGET_LOAD ]I: teradata_terminate_tpt(...) successfully applied conn->Terminate() method for task identified by UUID '74a9fd96-8335-5944-bb67-6aaa83fb3782' (teradata_tpt_engine.cpp:846)
[TARGET_LOAD ]E: TPT data acquisition failed [1020403] (teradata_endpoint_imp.c:1480)
[TARGET_LOAD ]E: Handling End of table 'REP'.'YYYYY' loading failed by subtask 1 thread 1 [1025000] (endpointshell.c:3069)
[TASK_MANAGER ]W: Table 'ADM'.'XXXXX' (subtask 1 thread 1) is suspended. TPT data acquisition failed; Handling End of table 'REP'.'YYYYY' loading failed by subtask 1 thread 1 (replicationtask.c:3208)
Is there any option to ignore these warnings/errors and continue the task as in v2022.05?
Any advice would be appreciated.
Regards,
Kyoko Tajima
I have a SQL Server database with merge replication configured to another SQL Server, and I need to replicate some tables using Qlik Replicate. We know that CDC cannot be activated, so I would like to know what the restrictions are regarding transactional replication: whether it impacts the merge replication that already exists, and whether Qlik uses logs other than the one already being used by MERGE.
Wondering about Qlik Talend Data Integration sessions? There are 11, in addition to all of the Data & Analytics sessions. So meet us in Orlando, June 3-5.
Join us on May 15th at 11 AM ET to discuss the Qlik Ideation Process. Bring your questions.
Browse our helpful how-to's to learn more about navigating Qlik Community and updating your profile.
Qlik enables a frictionless migration to the AWS cloud for Empresas SB, a group of Chilean health and beauty retail companies employing 10,000 people across 600 points of sale.
Qlik Luminary Stephanie Robinson of JBS USA, the US arm of the global food company employing 70,000 people in the US and over 270,000 worldwide.
Join one of our Location and Language groups. Find one that suits you today!
A private group for healthcare organizations, partners, and Qlik healthcare staff to collaborate and share insights.
This is the Japanese-language group of the Qlik Community. You can download Japanese-language materials about Qlik products and post questions in Japanese.
Welcome to the group for Brazil users. All discussions will be in Portuguese.
Hear from your Community team as they tell you about updates to the Qlik Community Platform and more!