Hi guys.
I have a situation which I believe may not have a solution, but I will ask anyway...
Is there a way, in the script, to sum sales values between two dates? For example:
LOAD * INLINE
[STORE,DATE
001, 11/05/2016
002, 02/06/2016
003, 20/04/2016
002, 04/08/2016
001, 11/09/2016
003, 14/10/2016];
I have the sales by store and day in another .qvd. Now I want to build logic in QlikView that does this:
001, 11/09/2016, $ 60000 --where this "$ 60000" is the sum between 11/05/2016 and 11/09/2016
002, 02/06/2016, $ 3000 --where this "$ 3000" is the sum between the beginning and 02/06/2016
003, 20/04/2016, $ 45790 --where this "$ 45790" is the sum between the beginning and 20/04/2016
002, 04/08/2016, $ 5000 --where this "$ 5000" is the sum between 02/06/2016 and 04/08/2016
001, 11/05/2016, $ 23456 --where this "$ 23456" is the sum between the beginning and 11/05/2016
003, 14/10/2016, $ 78998 --where this "$ 78998" is the sum between 20/04/2016 and 14/10/2016
Understood? I mean, I want to sum the sales between each DATE and the previous DATE found for that store...
I can't think of anything to solve this request...
Thank you!!!
Hi Matheus,
Try this script:
[Store Dates]:
LOAD * INLINE
[STORE,DATE
001, 11/05/2016
002, 02/06/2016
003, 20/04/2016
002, 04/08/2016
001, 11/09/2016
003, 14/10/2016];
[Store Dates1]:
LOAD
STORE,
Date#(DATE) as ToDate,
If(STORE = Previous(STORE), Date(Date#(Previous(DATE)) + 1), Date#('01/04/2016')) as FromDate
Resident [Store Dates] Order by STORE, DATE;
DROP Table [Store Dates];
This creates the table below, with proper intervals on which you can perform an IntervalMatch with extended syntax. I guessed at a begin date of 01/04/2016.
STORE | FromDate | ToDate |
---|---|---|
001 | 01/04/2016 | 11/05/2016 |
001 | 12/05/2016 | 11/09/2016 |
002 | 01/04/2016 | 02/06/2016 |
002 | 03/06/2016 | 04/08/2016 |
003 | 01/04/2016 | 20/04/2016 |
003 | 21/04/2016 | 14/10/2016 |
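One caveat with the script above: Date#(DATE) with no format string falls back on the script's DateFormat variable, and the Order By sorts the raw DATE field, so both only behave correctly when DateFormat matches the data. If your environment is not set to DD/MM/YYYY, here is a variant that parses the dates explicitly up front (just a sketch; the 'DD/MM/YYYY' format code is an assumption based on the sample dates):

// Parse the dates in a preceding load over the INLINE table, so neither the
// sort nor the interval arithmetic depends on the DateFormat variable.
[Store Dates]:
LOAD STORE, Date#(DATE, 'DD/MM/YYYY') as DATE;
LOAD * INLINE
[STORE, DATE
001, 11/05/2016
002, 02/06/2016
003, 20/04/2016
002, 04/08/2016
001, 11/09/2016
003, 14/10/2016];

[Store Dates1]:
LOAD
STORE,
Date(DATE) as ToDate,
// Previous(DATE) is already a real date here, so adding 1 day is safe.
If(STORE = Previous(STORE), Date(Previous(DATE) + 1), Date#('01/04/2016', 'DD/MM/YYYY')) as FromDate
Resident [Store Dates] Order By STORE, DATE;

DROP Table [Store Dates];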
If your sales table has fields STORE, DATE (the day of each sale) and SALES, then try an IntervalMatch along these lines, matching on the sales table's date field:
IntervalMatch:
IntervalMatch (DATE, STORE) LOAD FromDate, ToDate, STORE Resident [Store Dates1];
This will give you a synthetic key, but don't be alarmed: this is expected and is not a problem. When QlikView creates a synthetic key as a result of a properly implemented IntervalMatch, that key is an efficient link table allowing the intervals in one table to be associated with the discrete values in another.
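If you also want the summed sales per interval computed in the script, rather than with a chart expression, one common pattern is to join the intervals into the sales table and then aggregate. A sketch only, assuming your .qvd is loaded as [Sales] with fields STORE, DATE (a real, parsed date) and SALES; adjust the names to your data:

// Join the matching interval onto each sales row. Using IntervalMatch inside
// a join avoids the synthetic key, because FromDate/ToDate land in [Sales].
Left Join ([Sales])
IntervalMatch (DATE, STORE)
LOAD FromDate, ToDate, STORE Resident [Store Dates1];

// One row per store and interval, with the sales summed over that interval.
[Interval Sales]:
LOAD STORE, FromDate, ToDate, Sum(SALES) as IntervalSales
Resident [Sales]
Group By STORE, FromDate, ToDate;

In a chart, the equivalent is simply Sum(SALES) with STORE and ToDate as dimensions.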
Good luck
Andrew
Maybe try with IntervalMatch...
Andrew, it worked...
I did what you said and it worked... I still don't quite get how to use IntervalMatch, so I did the actual sum another way, because I needed to sum the sales values between the begin date and the end date and couldn't manage that with IntervalMatch... But anyway, it helped a lot... Thank you, and thanks to everyone who helped...
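For anyone who, like Matheus, wants to skip IntervalMatch entirely, a plain join plus a range filter gives the same sums. Again just a sketch, with [Sales], DATE and SALES as assumed names for the sales .qvd's table and fields:

// Attach every interval of the store to every sales row of that store...
Left Join ([Sales])
LOAD STORE, FromDate, ToDate Resident [Store Dates1];

// ...then keep only rows whose sale date falls inside the interval, and sum.
// Assumes DATE in [Sales] is a real (parsed) date, so the comparison is numeric.
[Sales Between Dates]:
LOAD STORE, FromDate, ToDate, Sum(SALES) as IntervalSales
Resident [Sales]
Where DATE >= FromDate and DATE <= ToDate
Group By STORE, FromDate, ToDate;

This is less memory-friendly than IntervalMatch on large data, because the join briefly multiplies the sales rows by the number of intervals per store.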