razvan_brais
Creator III

Previous month sales in loading script

Hi guys. I hope you can help me.

I need to load the current month's sales in the load script, and I need the previous month's sales in the same table as well.

The script looks something like this:
 

LOAD
    CLIENT_ID,
    CLIENT_NAME,
    LOCATION_ID,
    ApplyMap('MapLocations', Month, 'no_location') AS LOCATION_NAME,
    SEGMENT,
    BUSSINES_FLG,
    NPL,
    RATING,
    MakeDate([YEAR], [Month]) AS HIST_DATE,
    NUM([Month]) AS No_Month,
    Month(MakeDate(2000, NUM([Month]), 1)) AS MonthMonth,
    SALES,
    Previous(SALES) AS PREV_SALES   // just takes the record loaded immediately before
FROM TABLE;

 

The problem is when I'm comparing current sales for January with the previous month, December...

I don't want to do this with set analysis; I would like to have the values calculated in the load script.
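For illustration, this is the kind of per-client, per-month logic I am after (a rough sketch only, not my real script; it assumes the data above has already been landed in a resident table I'll call Sales_Tmp here):

// Sketch only: order by client and month so Previous() looks at the same
// client's preceding month instead of whatever record happened to come before.
Sales:
NoConcatenate
LOAD
    CLIENT_ID,
    HIST_DATE,
    SALES,
    If(Previous(CLIENT_ID) = CLIENT_ID
        and Previous(HIST_DATE) = AddMonths(HIST_DATE, -1),
       Previous(SALES)) AS PREV_SALES
Resident Sales_Tmp
Order By CLIENT_ID, HIST_DATE;
Drop Table Sales_Tmp;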

Can you please help me with some hints?

Thank you,

Razvan

36 Replies
razvan_brais
Creator III
Author

@Vegar ,

By doing a left join I will lose some clients' sales in PREV_SALES:

Example: if in Jan 2020 I have sales for 4 clients and in Dec 2019 I have sales for 6 clients (the same 4 clients from Jan 2020 plus 2 clients that appear only in Dec 2019), the result keeps values only for those 4 clients.

By doing a plain JOIN between the two tables, the QVD grows from 1 GB to 3.5 GB 😞
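A small inline sketch of what I mean (made-up client IDs and numbers):

// Made-up data: Jan 2020 has sales for clients 1-4.
JAN_2020:
LOAD * Inline [
CLIENT_ID, SALES
1, 100
2, 200
3, 300
4, 400
];

// Dec 2019 has sales for clients 1-6. With Left Join, clients 5 and 6
// disappear from the result; a plain (outer) Join keeps them, but with
// null SALES for January.
Left Join (JAN_2020)
LOAD * Inline [
CLIENT_ID, PREV_SALES
1, 90
2, 190
3, 290
4, 390
5, 500
6, 600
];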

Thank you,

Razvan

Vegar
MVP

I was aware that certain combinations would be lost; from one of your previous comments I thought that was what you wanted. You can avoid those gaps by using a plain Join (an outer join) instead.

Remember that doubling the number of rows in your QVD does not necessarily double the amount of RAM consumed in your application, thanks to the associative technique used in the Qlik QIX engine. It is primarily an increase in unique field values that increases RAM usage. To reduce the data size, you could try aggregating to just the dimensions you are interested in.

You could also remove ID fields that are not used in the application or data model, and round timestamps down to dates wherever the exact time is not of interest. You could also consider using the AutoNumber function on fields that only serve as keys between tables in your data model.
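As a rough sketch of those ideas (placeholder table and field names, adapt to your own model):

// Sketch only: fewer distinct field values = smaller QVD and less RAM.
Facts_Optimized:
NoConcatenate
LOAD
    // Compact integer key; use the same AutoNumber ID when building the
    // matching key on the other side of the link.
    AutoNumber(CLIENT_ID & '|' & LOCATION_ID, 'ClientLoc') AS %ClientLocKey,
    Date(Floor(TRANSACTION_TIME)) AS TRANSACTION_DATE,   // timestamp rounded down to a date
    SEGMENT,
    SALES
Resident Facts_Raw;   // fields not listed here (unused IDs, raw timestamps) are simply left out
Drop Table Facts_Raw;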

Good luck.

ArnadoSandoval
Specialist II

Hi @razvan_brais 

Earlier today I reviewed @Vegar's script. It did not work for me with Left Join (Sample), but it did when I changed the join to Outer Join (Sample). It still requires some adjustments, because those December 2019 clients then appear in January 2020 with some nulls, so I added logic to resolve those nulls. (Take a look at this article explaining joins, with a small example of each join type: Understanding Join-Keep-and-Concatenate; it clearly shows the behaviour.)

WARNING: any client with a gap between months in its transactions will get a record filling the gap. I did not test the scenario where the gap is bigger than one month.

Here is my script; the data is preloaded into TABLE in a previous section:

NoConcatenate 

TEMP: // <-- load the data to a temp table because of the nulls.
Load 
    CLIENT_ID,    
    LOCATION_ID,    
    SEGMENT,   
    HIST_DATE,
    No_Month,    
    MONTHNAME(MAKEDATE(Year(HIST_DATE),Month(HIST_DATE)) ) AS Period,
    SALES AS SALES
Resident TABLE;

Outer Join (TEMP)   // <---- Outer Join instead of Left Join

Load 
    CLIENT_ID,    
    LOCATION_ID,    
    SEGMENT,   
    AddMonths(HIST_DATE, 1) As HIST_DATE,
    Num(Month(AddMonths(HIST_DATE, 1))) AS Month_No,   // note: named Month_No, not No_Month, so it is not carried into DATA below
    SALES AS PREV_SALES
Resident TABLE;

NoConcatenate

// now the script handling the nulls.
DATA:
Load
    CLIENT_ID,    
    LOCATION_ID,    
    SEGMENT,   
    HIST_DATE,
    If(IsNull(No_Month), Month(HIST_DATE), No_Month) As No_Month,  
    If(IsNull(Period), MONTHNAME(MAKEDATE(Year(HIST_DATE),Month(HIST_DATE)) ), Period) As Period,
    SALES,
    PREV_SALES
Resident TEMP;

Drop Table TEMP;

Store DATA into [lib://QVDs/Results_v2.qvd] (qvd);

Drop Table TABLE;

You can give this script a try.

Because the script fills all the gaps (missing months), you will end up with a somewhat larger QVD. I do not know how much larger yours will be; locally I got about 5 extra KB.
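If you want to check the bigger-gap scenario yourself, a quick inline stand-in for TABLE (made-up values; YEAR_NO and MONTH_NO are placeholder column names) could look like this before running the script above:

// Made-up test data: client 1 has no rows for Feb or Mar 2020, so after the
// outer join it should gain a Feb 2020 row carrying PREV_SALES from January
// (and a May 2020 row from April), both with null SALES.
TABLE:
LOAD
    CLIENT_ID,
    LOCATION_ID,
    SEGMENT,
    MakeDate(YEAR_NO, MONTH_NO) AS HIST_DATE,
    MONTH_NO AS No_Month,
    SALES
Inline [
CLIENT_ID, LOCATION_ID, SEGMENT, YEAR_NO, MONTH_NO, SALES
1, 10, A, 2020, 1, 100
1, 10, A, 2020, 4, 130
2, 20, B, 2020, 1, 200
2, 20, B, 2020, 2, 210
];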

Hope this helps, 

Arnaldo Sandoval
A journey of a thousand miles begins with a single step.
razvan_brais
Creator III
Author

Hi guys,

I finally managed to solve the issue. Your suggestions and solutions helped a lot.

I used @Vegar's script to get PREV_SALES.

The final script is below. Based on PREV_SALES, I needed to calculate the sales variation between the current month and the previous month.

Now I don't know which comment to mark as the solution, because all of you helped me arrive at the final script 🙂

Temp_Data_1:
LOAD
    CLIENT_ID,    
    LOCATION_ID,    
    SEGMENT,   
    YEAR,            // <---- this column was introduced
    No_Month, 
    Date(MonthName(MAKEDATE(YEAR,No_Month) )) AS HIST_DATE,
    SALES 
FROM TABLE;


join(Temp_Data_1)
LOAD
    CLIENT_ID,    
    LOCATION_ID,    
    SEGMENT,   
    AddMonths(HIST_DATE,1) AS HIST_DATE,  
    SALES as PREV_SALES
Resident Temp_Data_1;

//getting max of Date from table with SALES and PREV_SALES and store it in a variable
Max_Month:
Load Max(HIST_DATE) as Max_Date
Resident Temp_Data_1;

Let vMaxMonth = Num(Peek('Max_Date'));  // keep it numeric so the $(...) expansion below is format-safe
Drop Table Max_Month;

//loading all the data except the extra month that has been added to the table
NoConcatenate
Final:
Load CLIENT_ID,
     LOCATION_ID,
     SEGMENT,
     HIST_DATE,
     SALES,
     PREV_SALES,
     if(Len(SALES)<1,0,SALES) - if(Len(PREV_SALES)<1,0,PREV_SALES) as VARIATION
 Resident Temp_Data_1
 Where Date(HIST_DATE) < Date($(vMaxMonth))
;
drop Table Temp_Data_1;

 

Thank you again for your help,

Razvan

Vegar
MVP

It's great that you finally managed to get a solution for your problem.

May I ask whether you managed to keep the QVD size down?

You should mark this thread as solved for future reference. You can mark your earlier reply with the final script as the correct answer. I think you can mark up to three posts as accepted solutions in a thread.

ArnadoSandoval
Specialist II

@razvan_brais 

Fantastic! It was a team effort helping you get your issue solved.

Arnaldo Sandoval
A journey of a thousand miles begins with a single step.
razvan_brais
Creator III
Creator III
Author

@Vegar, the size of the QVD is double the initial QVD, but I also added 20 new columns to the final table 🙂

 Thank you all again for your help.

 #staysafe 😄