Hi,
I need to create a key from 5 fields the two tables have in common, but I also need to keep those fields in both tables when loading.
Saleskit:
LOAD
"Year",
"Sales Country",
"Sales Manager",
"Sales rep",
"Sales Role",
"LATAM Lara BP code",
"Master Customer Name",
"Customer name",
Direction,
"Vol. Budget (Teus)",
Segmentation,
Category,
"Month",
"Week"
FROM [lib://External source files/saleskit.qvd]
(qvd);
Database:
LOAD
Service,
Voyage,
Vessel,
Booking,
BL,
"Year",
"Month",
"Week",
"LATAM Lara BP code",
Direction
FROM [lib://External source files/export.qvd]
(qvd);
The key needs to be built from the fields ("Year" & "Month" & "Week" & "LATAM Lara BP code" & Direction).
The problem is that if I keep all 5 fields in both tables, Qlik will automatically create a synthetic key, which I cannot use.
And I cannot avoid loading all 5 fields from both tables, because I need a full outer join.
Thanks, Karine.
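One common approach (a sketch, not from the original post) is to build one explicit composite key in both loads and then rename the five shared fields in the second table, so only the key field associates the tables and no synthetic key is formed. The `DB` prefixes below are hypothetical names chosen for illustration:

```
// Hedged sketch: one explicit %Key in both tables; the shared fields
// are renamed in the second load so they no longer auto-associate.
Saleskit:
LOAD
    "Year" & '|' & "Month" & '|' & "Week" & '|'
        & "LATAM Lara BP code" & '|' & Direction AS %Key,
    *   // shorthand for the original Saleskit field list
FROM [lib://External source files/saleskit.qvd] (qvd);

Database:
LOAD
    "Year" & '|' & "Month" & '|' & "Week" & '|'
        & "LATAM Lara BP code" & '|' & Direction AS %Key,
    "Year"               AS [DB Year],
    "Month"              AS [DB Month],
    "Week"               AS [DB Week],
    "LATAM Lara BP code" AS [DB LATAM Lara BP code],
    Direction            AS [DB Direction],
    Service, Voyage, Vessel, Booking, BL
FROM [lib://External source files/export.qvd] (qvd);
```

The five values stay available in both tables (under the renamed fields in Database), while the single %Key field gives the one-field association a full outer association needs.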
Hello everybody!
I'm setting up an integration with JSON Web Token (JWT) and I want to create sessions longer than an hour. Is this possible?
The payload I am using is the following:
const payload = {
  jti: uid.sync(32),
  sub: `${userName}`, // user obtained from the database
  subType: "user",
  name: `${userName}`,
  email: `${email}`,
  email_verified: true,
  iat: Math.floor(Date.now() / 1000),
  exp: Math.floor(Date.now() / 1000) + 60 * 60, // here I want to increase the timeout to more than an hour
  nbf: Math.floor(Date.now() / 1000),
  iss: "myISS",
  aud: "qlik.api/login/jwt-session",
  groups: [
    "Analytics Admin",
    "Data Admin",
    "Data Space Creator",
    "Developer",
    "Managed Space Creator",
    "Shared Space Creator",
    "Tenant Admin",
  ],
};
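On the token side, the `exp` claim itself can be pushed out by changing the offset added to the current time; a small sketch below factors that into a parameter. Note this only controls the token's validity window: whether Qlik honors a longer session is governed by the tenant's own session policy, so treat the 6 hours below as an assumption to test, not a guarantee.

```javascript
// Hedged sketch: compute iat/nbf/exp with a configurable lifetime in hours,
// instead of hard-coding 60 * 60 inline in the payload.
function buildTimestamps(hours) {
  const now = Math.floor(Date.now() / 1000); // seconds since epoch
  return {
    iat: now,                   // issued at
    nbf: now,                   // not valid before
    exp: now + hours * 60 * 60, // expiry pushed out by `hours`
  };
}

const { iat, nbf, exp } = buildTimestamps(6); // e.g. a 6-hour token
console.log(exp - iat); // 21600 seconds
```

The three timestamps can then be spread into the payload object (`...buildTimestamps(6)`) in place of the three inline `Math.floor(Date.now() / 1000)` expressions.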
Qlik Compose for Data Warehouses
Hi All,
I have run the command below from the bin folder of the Qlik server, which is installed on the F drive.
My output folder is on the D drive.
ComposeCli.exe export_csv --project WH_Master --outfolder D:\WHPUAT
This command exported only the CSV files related to the Model, Mappings, and Data Marts.
It did not export any files related to the Data Warehouse tasks or the SQL files, and it also showed an error:
SYS-E-HTTPFAIL This given path's format is not supported..
Could you please help me with this error?
Thanks!
Hello Experts
Our customer has upgraded from v2022.05 to v2023.11 and encountered the problem below.
In 2022.05, when the following warning occurred, the CDC task continued without any errors:
[TARGET_LOAD ]I: Load finished for table 'ADM'.'XXXXX' (Id = 1). 85627944 rows received. 0 rows skipped. Volume transferred 98611582288. (streamcomponent.c:3976)
[TARGET_LOAD ]I: TPT statistics for table 'REP.YYYYY': Operator=LOAD, CPU time=1115.296875, Received Rows=85627944, Sent Rows=85627944, Applied Rows=85627917 (teradata_tpt_engine.cpp:778)
[TARGET_LOAD ]W: Not all rows have been loaded successfully for 'REP'.'YYYYY' due to possible 'TPT LOAD' errors. Consult Target 'xxxx_E?' / 'xxxx_L?' Error / Log tables contents. (teradata_tpt_engine.cpp:786)
[TARGET_LOAD ]I: teradata_terminate_tpt(...) successfully applied conn->Terminate() method for task identified by UUID '74a9fd96-8335-5944-bb67-6aaa83fb3782' (teradata_tpt_engine.cpp:839)
[TASK_MANAGER ]I: Loading finished for table 'ADM'.'XXXXX' (Id = 1) by subtask 1. 85627944 records transferred. (replicationtask.c:3012)
[TARGET_LOAD ]I: teradata_stop_imp(...) stopping task identified by UUID '74a9fd96-8335-5944-bb67-6aaa83fb3782' (teradata_endpoint_imp.c:857)
[TARGET_LOAD ]I: teradata_free(...) freeing resources for task identified by UUID '74a9fd96-8335-5944-bb67-6aaa83fb3782' (teradata_endpoint_imp.c:878)
[TARGET_LOAD ]I: teradata_disconnect(...) disconnecting for task identified by UUID '74a9fd96-8335-5944-bb67-6aaa83fb3782' (teradata_endpoint_imp.c:92)
[TARGET_LOAD ]I: teradata_disconnect(...) disconnecting Data Connection succeeded for task identified by '74a9fd96-8335-5944-bb67-6aaa83fb3782' (teradata_endpoint_imp.c:104)
[TARGET_LOAD ]I: teradata_disconnect(...) No active Lookup Connection to disconnect from was detected for task identified by '74a9fd96-8335-5944-bb67-6aaa83fb3782' (teradata_endpoint_imp.c:122)
[TASK_MANAGER ]I: Subtask #1 ended (replicationtask_util.c:591)
However, the behavior has changed after upgrading to v2023.11.
After the same warning occurs, it is now treated as an error and the table is suspended:
[TARGET_LOAD ]I: Load finished for table 'ADM'.'XXXXX' (Id = 1). 85986652 rows received. 0 rows skipped. Volume transferred 99014228864. (streamcomponent.c:4076)
[TARGET_LOAD ]I: TPT statistics for table 'REP.YYYYY': Operator=LOAD, CPU time=1126.937500, Received Rows=85986652, Sent Rows=85986652, Applied Rows=85986625 (teradata_tpt_engine.cpp:781)
[TARGET_LOAD ]W: Not all rows have been loaded successfully for 'REP'.'YYYYY' due to possible 'TPT LOAD' errors. Consult Target 'xxxx_E?' / 'xxxx_L?' Error / Log tables contents. (teradata_tpt_engine.cpp:789)
[TARGET_LOAD ]I: teradata_terminate_tpt(...) successfully applied conn->Terminate() method for task identified by UUID '74a9fd96-8335-5944-bb67-6aaa83fb3782' (teradata_tpt_engine.cpp:846)
[TARGET_LOAD ]E: TPT data acquisition failed [1020403] (teradata_endpoint_imp.c:1480)
[TARGET_LOAD ]E: Handling End of table 'REP'.'YYYYY' loading failed by subtask 1 thread 1 [1025000] (endpointshell.c:3069)
[TASK_MANAGER ]W: Table 'ADM'.'XXXXX' (subtask 1 thread 1) is suspended. TPT data acquisition failed; Handling End of table 'REP'.'YYYYY' loading failed by subtask 1 thread 1 (replicationtask.c:3208)
Is there any option to ignore these warnings/errors and continue the task as in v2022.05?
Any advice would be appreciated.
Regards,
Kyoko Tajima
I have a SQL Server database with merge replication configured to another SQL Server, and I need to replicate some tables using Qlik Replicate. We know that CDC cannot be activated. I would like to know what the restrictions are regarding transactional replication: does this impact the merge replication that already exists, and does Qlik use logs other than the one already being used by merge replication?
I have two fields, Order Quantity and Net Price. Net Price is a decimal, so I want to include the decimals as well. If I use Round, Floor, or Ceil I will lose or gain spend.
The output should look like this:
OrderId | Order Quantity | Net Price GCT | PO Sub Total (Order Qty * Net Price GCT)
101 | 500 | 6.04 | 3020
101 | 100 | 14.39 | 1439
Total | | | 2.9B
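One way to avoid losing spend (a sketch, assuming the field names above) is to multiply at full precision and apply formatting only at display time, since Num() changes how a value is shown, not the number used in the total:

```
// Compute the subtotal without rounding; only the display is formatted.
Num(Sum([Order Quantity] * [Net Price GCT]), '#,##0.00')
```

Round/Floor/Ceil would alter the underlying values row by row, which is what makes the grand total drift.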
Good afternoon. I am trying to calculate (current period - previous period) / previous period * 100; the goal is a sales variation in %. For example, with my filter I am comparing the same period in different years, February 2023 and February 2024.
The expression I am using is:
( Sum({<Ano_DC={"$(VAnoAtual_DC)"}, Mes_DC={"$(VMesAtual_DC)"}>} TOTAL_ITEM_DC) - Sum({<Ano_DC={"$(VAnoAnterior)"}, Mes_DC={"$(VMesAtual_DC)"}>} TOTAL_ITEM_DC) ) / Sum({<Ano_DC={"$(VAnoAtual_DC)"}, Mes_DC={"$(VMesAtual_DC)"}>} TOTAL_ITEM_DC)
The variables are defined as:
VAnoAtual_DC:
=MAX(Ano_DC)
VAnoAnterior:
=MAX(Ano_DC) -1
VMesAtual_DC:
=MAX(Mes_DC)
Is it possible to do an interval match with two set of intervals? In my fact data, I have the field month end, which looks in the interval Effective Start and Effective Stop. In that same interval match function, can I have the Sales field from my fact data, look in the interval Min Threshold and Max Threshold as well? This is ultimately so I can find out what scale to give my fact data, based on if the sales is between a certain amount and if the sales took place between a certain date range.
Here is what my two tables look like.
Fact Data:
Sales | MonthEnd | Employee
$40 | 3/30/3024 | A
$25 | 1/31/2024 | B
$81 | 1/31/2024 | A
$78 | 1/31/2024 | C
$110 | 4/30/2024 | A
$2 | 4/30/2024 | A
$179 | 3/31/2024 | B
$63 | 2/29/2024 | C
$92 | 2/29/2024 | A
Dimension/Intervals Data:
Effective Start | Effective Stop | Min Threshold | Max Threshold | Scale
1/1/2024 | 2/29/2024 | 46 | 100,000,000 | Pass
1/1/2024 | 2/29/2024 | 0 | 45 | Fail
3/1/2024 | 3/30/2024 | 51 | 100,000,000 | Pass
3/1/2024 | 3/30/2024 | 0 | 50 | Fail
4/1/2024 | 12/31/9000 | 0 | 130 | Fail
4/1/2024 | 12/31/9000 | 131 | 100,000,000 | Pass
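IntervalMatch matches one value against one interval pair at a time, so when a row has to satisfy two interval conditions at once, a common workaround is a cartesian join followed by a Where clause that checks both ranges together. A sketch, assuming tables named FactData and Intervals are already loaded with the fields above, that Sales and the thresholds are numeric, and that MonthEnd and the effective dates are proper dates:

```
// Hedged sketch: cartesian-join the intervals onto the facts, then keep
// only the rows where BOTH the date and the amount fall in range.
Tmp:
LOAD * RESIDENT FactData;
Join (Tmp)                       // no shared fields -> cartesian product
LOAD * RESIDENT Intervals;

Scored:
NoConcatenate
LOAD *
RESIDENT Tmp
WHERE MonthEnd >= [Effective Start] AND MonthEnd <= [Effective Stop]
  AND Sales    >= [Min Threshold]   AND Sales    <= [Max Threshold];

DROP TABLES Tmp, FactData, Intervals;
```

Since the join is cartesian (facts x interval rows) before the Where filter, this is fine for small interval tables like the one above but worth watching on large ones.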
I have just installed Feb 2024 on a lab server for testing.
I have imported one of our production apps (from a Feb 2022 cluster) which uses derived fields for calendar dimensions, built with the DECLARE FIELD DEFINITION ... FIELDS and DERIVE FIELDS FROM FIELDS ... USING ... script construction.
There seems to be a bug with the derived fields: Qlik recognises them and shows them in the field list, but the expression editor reports an error in the expression (a red border around the field box), and when a derived field is used with a function, in this case the Class function, it does not work and returns an error.
This all works 100% OK in our Feb 2022 production environment
Has anyone else encountered this?
Hi all,
I have this sample dataset from one source. I want to show just the Store and Systems columns in a straight table, but filtered so that only rows where the dimension Function has the value 'Fixed Ops' appear. Any help here would be appreciated, thanks in advance!
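One common approach (a sketch, assuming the field names Store, Systems, and Function from the post) is to keep Store as the dimension and restrict the measure with set analysis, so only 'Fixed Ops' rows contribute:

```
// Only rows whose Function is 'Fixed Ops' feed the measure; enable
// "suppress zero values" on the chart to hide the remaining stores.
Only({<Function = {'Fixed Ops'}>} Systems)
```

Alternatively, if the filter should apply app-wide rather than per chart, the same restriction can be done in the load script with a WHERE Function = 'Fixed Ops' clause.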