I am using the formula below in a pivot table (with two dimensions, MATERIAL and GERENCIA):
Sum({<FECHA = {">=$(=Date(MonthStart(Max(FECHA)), 'DD/MM/YYYY'))<=$(=Date(Max(FECHA), 'DD/MM/YYYY'))"}>} SALES)
/
Sum({<FECHA = {">=$(=Date(MonthStart(Max(FECHA)), 'DD/MM/YYYY'))<=$(=Date(Max(FECHA), 'DD/MM/YYYY'))"}>} Aggr([Días], FECHA, MATERIAL, GERENCIA))
*
Max(Aggr(If(FECHA = Max(TOTAL FECHA), TotalDías), FECHA, MATERIAL, GERENCIA))
This formula works correctly if I remove one of the two dimensions from the pivot table and also remove the unused dimension from the formula. The problem appears when I use both dimensions:
- With GERENCIA first and MATERIAL second, the MATERIAL results are correct, but GERENCIA is wrong: for a reason I don't understand, the total value of GERENCIA is divided by two (the number of distinct values in the MATERIAL field).
- With MATERIAL first and GERENCIA second, the GERENCIA values are calculated correctly, but the total value for MATERIAL is divided by the number of distinct values in the GERENCIA field (in this case, divided by 8).
If the value of the first pivot dimension were not divided by the number of distinct values of the second dimension, it would be calculated correctly.
So, I would appreciate it if any of you have an idea of how to resolve this problem when using the Aggr function with more than one dimension in a pivot table.
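For what it's worth, one pattern that sometimes fixes pivot totals in this situation (a hedged sketch only, not verified against this data model) is to compute the whole ratio per MATERIAL/GERENCIA cell inside an outer Aggr and then Sum those cell values, so each subtotal is the sum of its cells rather than a re-evaluation that mixes in the other dimension:

```qlik
// Sketch, untested assumption: wrap the ratio in an outer Aggr over
// both dimensions, then Sum the per-cell results so the totals row
// adds cells up instead of being divided by the distinct values of
// the other dimension.
Sum(Aggr(
    Sum({<FECHA = {">=$(=Date(MonthStart(Max(FECHA)), 'DD/MM/YYYY'))<=$(=Date(Max(FECHA), 'DD/MM/YYYY'))"}>} SALES)
    /
    Sum({<FECHA = {">=$(=Date(MonthStart(Max(FECHA)), 'DD/MM/YYYY'))<=$(=Date(Max(FECHA), 'DD/MM/YYYY'))"}>} [Días])
    *
    Max(If(FECHA = Max(TOTAL FECHA), TotalDías))
, MATERIAL, GERENCIA))
```

Whether Sum or Max is the right outer aggregation for the totals depends on how TotalDías should roll up, so treat this only as a starting point.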
Thanks.
Good day Qlik Community,
It's me again (not Mario, but Igor). Contrary to what my initials suggest (Alcantara, Igor), I am not an AI, but I do like AI. If you have read my previous articles on Data Voyagers (BLOG | Data Voyagers), you know that I have been writing about my experience with Qlik Answers. This is another one of those.
In theory, and that is what I have been telling customers, Qlik Answers only works with unstructured data. However, the second most common question I get is "Can Qlik Answers read structured data like Excel or CSV?" (the first is: "Can I export to Excel?").
Well, this time I decided to test it and explore how I could make the impossible possible.
It is all here. I hope you all enjoy it. If so, please like it and share it.
Can we make Qlik Answers talk to Excel? (datavoyagers.net)
Hi all
I've already made a comparison revenue sheet with two alternate states, Period 1 and Period 2 (year & months).
Now there is a request to add a "same store" filter that keeps only stores that were open during both alternate-state periods; the opening and closing dates are given in the following table:
Branch, Open Date, Close Date
Basically, for a branch to be marked as "same store", it needs to have been open during the time frames I choose as Period 1 and Period 2.
1. I started by creating the following table, which holds the list of all dates per branch between its opening and closing:
NoConcatenate
SS_Branches_tmp:
Load
    INDEX_Branch,
    Date(OpenStoreDate, 'DD/MM/YYYY') as SS_Branches_OpenDate,
    // A close year of 1899 means "still open", so substitute today's date
    If(Year(Date(CloseStoreDate, 'DD/MM/YYYY')) = 1899,
       Date(Today(), 'DD/MM/YYYY'),
       Date(CloseStoreDate, 'DD/MM/YYYY')) as SS_Branches_CloseDate_tmp
Resident Branches
Order By INDEX_Branch DESC;
NoConcatenate
SaleDates_tmp:
Load Distinct
    INDEX_Branch,
    Date_Key as SS_Branches_Date
Resident KeyTable;

// The join target is made explicit here; a table label placed after
// a join prefix is ignored, so the result stays in SaleDates_tmp.
Left Join (SaleDates_tmp)
Load
    INDEX_Branch,
    Date(SS_Branches_OpenDate + IterNo() - 1, 'DD/MM/YYYY') as SS_Branches_Date,
    'SS' as SS_Branches_Type
Resident SS_Branches_tmp
While SS_Branches_OpenDate + IterNo() - 1 <= SS_Branches_CloseDate_tmp;
NoConcatenate
SaleDates:
Load
    INDEX_Branch & '|' & SS_Branches_Date as INDEX_SSDates,
    INDEX_Branch as SS_Branches_Branch,
    SS_Branches_Date,
    If(Len(SS_Branches_Type) > 0, 'SS', 'NOT SS') as SS_Branches_Type,
    If(Len(SS_Branches_Type) > 0, 1, 0) as SS_Branches_TypeNum
Resident SaleDates_tmp;

Drop Table SaleDates_tmp;
Drop Table SS_Branches_tmp;
2. Then I created the following filter in the UI. Basically, it counts the working days in the selected period (Period 2) and, for each branch, counts its SS (same store) days between its opening and closing.
If the working days outnumber the branch's SS days, it is not a same store.
Aggr(
    If(
        Count(TOTAL {1<Year=Period2::Year, Month=Period2::Month, Week=Period2::Week, Date=Period2::Date, INDEX_Branch=>} Date)
        >
        Aggr(Count(DISTINCT {<Year=Period2::Year, Month=Period2::Month, Week=Period2::Week, Date=Period2::Date, SS_Branches_Type={'SS'}>} SS_Branches_Date), Branch),
        'Not SS', 'SS'),
    Branch)
Now, the problem is that it filters out all the data in the sheet. It only works if I put it into a pivot table and use the expression above as a dimension with "include null values" unchecked.
Please help me implement it so that it filters properly.
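As one possible direction (a sketch only; the measure field Revenue is a placeholder, and the exact behavior of TOTAL and the Period2 state fields inside a search expression would need testing), the same comparison can be pushed into a set modifier with a search expression, so the measures themselves exclude non-SS branches instead of relying on a calculated dimension:

```qlik
// Hedged sketch: keep only branches whose distinct 'SS' days cover
// all working days of Period 2, evaluated per Branch inside the
// advanced search of the set modifier.
Sum({<Branch = {"=Count(DISTINCT {<Year=Period2::Year, Month=Period2::Month, Week=Period2::Week, Date=Period2::Date, SS_Branches_Type={'SS'}>} SS_Branches_Date)
      >= Count(TOTAL {1<Year=Period2::Year, Month=Period2::Month, Week=Period2::Week, Date=Period2::Date, INDEX_Branch=>} Date)"}>} Revenue)
```

The advantage of this shape is that it works in any measure on the sheet rather than only as a pivot-table dimension.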
BR
Lev
Hi,
I want to delete some documents by _id from a DocumentDB collection. Some _id values are of String type, but others are of ObjectId type.
I can delete documents whose _id is a String, but I cannot remove the documents with an ObjectId.
Can you help me?
I also tried to use the MongoDB components to connect to the DocumentDB database, but Talend Studio appends the noCursorTimeout method to the find statement; DocumentDB does not support this feature, so the find method fails.
Gold Client BW is a versatile solution designed to streamline day-to-day business activities. It helps reduce the need for full system refreshes, provides the ability to copy queries and hierarchies to non-production systems, and aids in troubleshooting production support issues. The Gold Client BW tool saves time, reduces manual effort, and lets users focus on what matters most.
Usage examples of Gold Client BW are listed in Gold Client BW Usage Examples.
Hi,
I am integrating a Qlik Sense application into my web application via ticketing.
The users are now created in the QMC when they try to access the application, but they get rejected with the message "You cannot access Qlik Sense because you have no access pass."
I created a security rule to grant the users in a specific user directory access to the app, but it still doesn't work.
Any ideas?
Thanks.
Hi team, we have a requirement to store a datetime field converted from 12-hour to 24-hour format.
I'm sending from source Sybase as 2024-10-04 12:00:00.000 AM.
In Db2, want to store as 2024-10-04 24:00:00.000
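If the conversion can happen in a Qlik load script (an assumption; this may instead belong in a Replicate transformation), a minimal sketch would be to parse the 12-hour value and re-format it in 24-hour notation. Note that 12:00:00.000 AM is midnight, which 24-hour notation normally writes as 00:00:00.000; Db2 does accept the special value 24:00:00 for end of day, but a parser will not produce it.

```qlik
// Sketch: SourceTS and TargetTS are placeholder field names.
// Timestamp#() parses the 12-hour Sybase format (TT = AM/PM marker),
// Timestamp() re-renders it in 24-hour format.
Load
    Timestamp(
        Timestamp#(SourceTS, 'YYYY-MM-DD hh:mm:ss.fff TT'),
        'YYYY-MM-DD hh:mm:ss.fff'
    ) as TargetTS
Resident Source;
```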
Hi Everyone
I have installed Qlik Sense on my local machine. I recently noticed that the Scheduler service is stopped. When I try to start the service, it starts and then stops abruptly. When I checked the Scheduler logs, I found this: "Startup sequence failed. Shutting down to avoid any inconsistent state. Please try to restart the Scheduler." I tried restarting the services, and then tried manually stopping all the services and starting them again, but it still fails. Can anyone guide me on how to address this issue?
Regards
Sivanesan
Qlik Compose for Data Warehouses
We are using Compose for Data Warehouses to build various DIMs and FACTs for analytical requirements. In production, Compose is installed on an EC2 instance in the east1 region. For a DR scenario, the procedure we normally follow is: provision a new EC2 instance in east2, deploy the code, create the DW and DM, and generate the ETL instructions pointing to a new schema. Once generated, we update the connections back to the original DW and DM schemas and regenerate the ETL instructions to reflect the connection change. Since we use a Snowflake database for building our DIMs and FACTs, all of these metadata operations are relatively slow, and the whole build-and-regenerate cycle normally takes around 2 to 3 hours.
Is there an easier way to handle an HA or DR situation? For example, is there an option to copy the data folder from the east1 server to the new east2 (DR) EC2 instance, to avoid rebuilding and regenerating the ETL instructions? That way we could considerably reduce the recovery time.
Any help on this topic is greatly appreciated.
I have two questions.
1. What is the maximum rollover size for the log stream? Ours seems to be 1 GB (1000000000 bytes). We use more than that in a day.
2. If I have multiple tasks using the same log stream source, will one task remove files before another task is finished?
This is related to the Log retention period (min) setting:
We have also seen this in similar cases: when two tasks use the same CDC changes table, one task may delete rows that have not yet been processed by the second task.
This will also occur if one task was stopped while the other is still running; it has no relation to stopping the DB or SAP components.
If the two tasks do not share the same table set, it is recommended to use a different source endpoint that uses different CDC changes tables.