Not applicable

Providing Missing Records on Data Load based on Date Gaps

Hello!

I need some help building my data load.

The situation is similar to this one. I have a STOCKS database with four fields:

- STOCK TICKER (or code/name)

- STOCK DATE

- STOCK QUANTITY (# of stocks)

- STOCK VALUE (on the DATE above)

This database has many different stock codes and many dates.

I need to calculate the TOTAL value, which is the sum of (STOCK QUANTITY * STOCK VALUE) per DATE. This will feed a graph of the TOTAL value per day.

The problem? Not all STOCKS trade every day. For the days where a particular STOCK has no RECORD in the database, we should repeat the previous day's record, because otherwise we get GAPS in the graph!

Say, for instance, a MICRO stock code has records for 18/01/2010 and 23/01/2010. For the dates in between we should populate fake records with the value from 18/01/2010 (the latest one we had).

An even worse problem occurs AT THE END of the file: I may have records for, say, the GOOG stock code until today (26/01/2010) but none for MICRO. So I should REPEAT the latest MICRO record, from 23/01/2010, through 26/01/2010.

Got it?

I really don't know what to do. I think it needs some sort of FOR/NEXT loop across the dates, but I have no idea how.
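To illustrate the fill logic I'm after, here is a rough Python sketch (made-up sample tickers and numbers, not my real data); the QlikView script version is what I can't figure out:

```python
from datetime import date, timedelta

# Sample records: (ticker, date, quantity, value). MICRO has a gap
# between 18/01 and 23/01 and no record on the last date (26/01).
records = [
    ("GOOG",  date(2010, 1, 18), 10, 590.0),
    ("GOOG",  date(2010, 1, 23), 10, 585.0),
    ("GOOG",  date(2010, 1, 26), 10, 588.0),
    ("MICRO", date(2010, 1, 18),  5,  28.0),
    ("MICRO", date(2010, 1, 23),  5,  29.0),
]

by_ticker = {}
for ticker, d, qty, val in records:
    by_ticker.setdefault(ticker, {})[d] = (qty, val)

first = min(d for _, d, _, _ in records)
last = max(d for _, d, _, _ in records)

# Walk every calendar day; on gap days, repeat the latest known record
filled = []
for ticker, days in by_ticker.items():
    prev = None
    d = first
    while d <= last:
        if d in days:
            prev = days[d]
        if prev is not None:
            filled.append((ticker, d, *prev))
        d += timedelta(days=1)

# TOTAL value per date = sum of QUANTITY * VALUE over all tickers
totals = {}
for _, d, qty, val in filled:
    totals[d] = totals.get(d, 0.0) + qty * val
```

Note how MICRO keeps its 23/01 value all the way through 26/01 (the "end of file" case), so the totals line has no gaps.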

Thanks for your support!

BTW, I have attached a simple plain-text file illustrating the situation. Missing records are denoted by "###########"; those are the places where we must populate a fake record.

Once more, thank you!

6 Replies
Not applicable
Author

Hi

Could you attach a file with an example (it doesn't have to be the real data)? I'm sure I have a simple solution for you, but I always prefer to look at the original data first.

Not applicable
Author

OK! I have prepared an example QVW. It's a simulation, not real data, of course.

In the attached file you can see that asset code #185 has no values for 16 and 17/01/2010.

Code #201 also has no values for 15, 16 and 17/01/2010, nor for 9, 10 and 11/01/2010.

Code #996 has no values at the end: it stops on 21/01/2010, while most of the others run through 22/01/2010.

You can see that the TOTAL ASSET VALUE PER DATE graph is somewhat distorted by those gaps; it doesn't reflect the real situation.

Hope you can help me out !

Thanks !

Not applicable
Author

Hello

Below is an answer to your question.

I hope I could help you 🙂

ASSETS:

LOAD * INLINE [

ASSET_CODE;ASSET_DATE; ASSET_VALUE

808;31/12/2009; 16103,09

999;05/01/2010; 09943,25

257;31/12/2009; 00094,75

996;31/12/2009; 84821,09

705;31/12/2009; 83471,40

991;31/12/2009; 82748,10

240;31/12/2009; 29651,59

999;04/01/2010; 08955,29

999;31/12/2009; 05728,62

201;31/12/2009; 57922,47

185;04/01/2010; 98149,35

808;04/01/2010; 18281,20

996;04/01/2010; 84525,72

991;04/01/2010; 84183,26

705;04/01/2010; 83437,21

201;04/01/2010; 58481,93

258;04/01/2010; 51154,28

240;04/01/2010; 30098,72

185;05/01/2010; 98518,20

808;05/01/2010; 18915,52

258;05/01/2010; 51087,71

991;05/01/2010; 84102,77

705;05/01/2010; 83573,72

996;05/01/2010; 81598,91

201;05/01/2010; 58674,84

240;05/01/2010; 30263,67

999;07/01/2010; 10977,89

185;06/01/2010; 99002,75

808;06/01/2010; 20034,17

258;06/01/2010; 51125,75

991;06/01/2010; 84960,14

705;06/01/2010; 83628,91

201;06/01/2010; 59015,60

240;06/01/2010; 30487,65

185;07/01/2010; 98916,59

808;07/01/2010; 19900,18

999;08/01/2010; 11198,35

258;07/01/2010; 51187,56

991;07/01/2010; 84522,71

705;07/01/2010; 83782,84

996;07/01/2010; 82259,32

201;07/01/2010; 59382,00

240;07/01/2010; 30209,02

185;08/01/2010; 99233,08

808;08/01/2010; 19985,00

999;11/01/2010; 11112,15

258;08/01/2010; 51201,82

991;08/01/2010; 84809,67

705;08/01/2010; 83861,27

240;08/01/2010; 30134,02

185;11/01/2010; 99132,60

808;11/01/2010; 19853,47

999;12/01/2010; 10420,46

258;11/01/2010; 51237,48

991;11/01/2010; 84841,16

705;11/01/2010; 84029,73

996;11/01/2010; 81502,26

201;08/01/2010; 59795,87

240;11/01/2010; 30112,08

258;11/01/2010; 51237,48

257;11/01/2010; 00459,88

258;31/12/2009; 51092,09

257;05/01/2010; 00249,07

257;06/01/2010; 00335,87

257;07/01/2010; 00360,67

257;08/01/2010; 00391,67

185;12/01/2010; 99069,58

808;12/01/2010; 20615,63

999;13/01/2010; 11131,15

257;12/01/2010; 00428,88

991;12/01/2010; 84802,67

705;12/01/2010; 84009,39

996;12/01/2010; 81061,05

201;12/01/2010; 60425,65

258;12/01/2010; 51213,71

240;12/01/2010; 29585,72

185;13/01/2010; 98758,42

808;13/01/2010; 20818,46

257;13/01/2010; 00497,08

991;13/01/2010; 84655,69

705;13/01/2010; 84064,58

201;13/01/2010; 60333,57

258;13/01/2010; 51273,14

240;13/01/2010; 29641,17

999;14/01/2010; 10998,76

185;14/01/2010; 98414,84

808;14/01/2010; 19413,38

999;15/01/2010; 09797,85

257;14/01/2010; 00521,88

991;14/01/2010; 84393,23

705;14/01/2010; 83991,97

996;14/01/2010; 81149,25

201;14/01/2010; 60448,31

258;14/01/2010; 51313,56

240;14/01/2010; 29283,35

185;15/01/2010; 98354,40

808;15/01/2010; 18073,45

999;18/01/2010; 10540,00

257;15/01/2010; 00459,88

991;15/01/2010; 83584,85

996;15/01/2010; 78739,16

258;15/01/2010; 51199,45

240;15/01/2010; 28886,44

185;18/01/2010; 99048,88

808;18/01/2010; 18480,35

705;18/01/2010; 84166,24

991;18/01/2010; 83808,82

996;18/01/2010; 76960,00

201;18/01/2010; 60478,39

258;18/01/2010; 51251,75

240;18/01/2010; 29083,10

999;20/01/2010; 08472,87

185;19/01/2010; 99436,46

808;19/01/2010; 19750,21

257;19/01/2010; 00565,28

991;19/01/2010; 84396,73

705;19/01/2010; 84145,91

996;19/01/2010; 76262,47

201;19/01/2010; 60730,10

258;19/01/2010; 51382,50

240;19/01/2010; 29354,96

185;20/01/2010; 98706,97

808;20/01/2010; 17327,27

999;21/01/2010; 05362,78

257;20/01/2010; 00571,48

705;20/01/2010; 84177,85

991;20/01/2010; 83105,42

996;20/01/2010; 71853,36

258;20/01/2010; 51394,38

240;20/01/2010; 28753,61

185;21/01/2010; 97568,89

808;21/01/2010; 14443,35

999;22/01/2010; 05060,75

257;21/01/2010; 00571,48

705;21/01/2010; 84000,68

991;21/01/2010; 81432,68

996;21/01/2010; 72437,33

201;21/01/2010; 60081,97

258;21/01/2010; 51346,84

240;21/01/2010; 28214,68

185;22/01/2010; 97306,14

808;22/01/2010; 14336,41

257;22/01/2010; 00577,68

705;22/01/2010; 83974,54

991;22/01/2010; 81579,66

201;22/01/2010; 60317,86

258;22/01/2010; 51399,14

] (DELIMITER IS ';');

//*********************************************************************************************************************

//First we must load the list of all codes (using LOAD DISTINCT, so no code is repeated)

Qualify *; //Qualify means that QlikView prefixes every field name with its table name, in this case CODES (see the result)

CODES:

LOAD DISTINCT

ASSET_CODE

RESIDENT ASSETS;

MIN_MAX:

LOAD

Min(Num(Date(ASSET_DATE, 'DD/MM/YYYY'))) AS MIN, //The first date present in the data (the date must be converted to a number first)

Max(Num(Date(ASSET_DATE, 'DD/MM/YYYY'))) AS MAX //The last date present in the data (the date must be converted to a number first)

RESIDENT ASSETS;

Let min = PEEK('MIN_MAX.MIN', 0, 'MIN_MAX'); //Read the first and last dates from table MIN_MAX

Let max = PEEK('MIN_MAX.MAX', 0, 'MIN_MAX');

DROP TABLE MIN_MAX;

Unqualify *; //Unqualify stops prefixing field names with the table name

//*****************************************************************************************************************************

//We must build a value for each ASSET_CODE for every possible ASSET_DATE (between the first and last ASSET_DATE)

Let a = $(min);

do

//For each ASSET_CODE, find the last date on or before the current date (variable $(a)) on which it had a record

DATA_temp:

LOAD

ASSET_CODE AS ASSET_CODE_temp,

Max(Num(Date(ASSET_DATE, 'DD/MM/YYYY'))) AS ASSET_DATE_temp_1

RESIDENT ASSETS

WHERE

Num(Date(ASSET_DATE, 'DD/MM/YYYY')) <= $(a)

GROUP BY

ASSET_CODE;

LEFT JOIN

//Once we know the last date on which each asset was present, we can look up its value in table ASSETS

LOAD

ASSET_CODE AS ASSET_CODE_temp,

Num(Date(ASSET_DATE, 'DD/MM/YYYY')) AS ASSET_DATE_temp_1,

ASSET_VALUE AS ASSET_VALUE_temp

RESIDENT ASSETS

WHERE Num(Date(ASSET_DATE, 'DD/MM/YYYY')) <= $(a);

RIGHT JOIN

//Now we can add the full list of ASSET_CODEs

LOAD

CODES.ASSET_CODE AS ASSET_CODE_temp,

($(a)) AS ASSET_DATE_temp

RESIDENT CODES;

DATA:

LOAD

ASSET_CODE_temp AS ASSET_CODE_final,

Date(ASSET_DATE_temp, 'DD/MM/YYYY') AS ASSET_DATE_final,

If(IsNull(ASSET_VALUE_temp), 0, ASSET_VALUE_temp) AS ASSET_VALUE_final //If there was no value at the beginning (e.g. a new ASSET_CODE that never existed before),

RESIDENT DATA_temp; //then the first ASSET_VALUE is zero (see ASSET_CODE no. 185)

DROP TABLE DATA_temp;

exit do when a = max

Let a=a+1;

loop

DROP TABLE CODES;

// If you use the section below, you will keep only the dates that actually exist in table ASSETS; otherwise you will get every date between the first and last date (for example 01/01/2010, which is not

// present in table ASSETS). But that is your decision.

INNER JOIN

LOAD DISTINCT

Date(ASSET_DATE, 'DD/MM/YYYY') AS ASSET_DATE_final

RESIDENT ASSETS;
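For anyone who wants the gist of the loop above without reading the whole script: for every day between the first and last date, each code gets the value from its most recent record on or before that day, or zero if it has none yet. A Python sketch of that per-day logic (illustrative codes and values, not the QlikView syntax):

```python
from datetime import date, timedelta

# Tiny stand-in for the ASSETS table (illustrative codes and values)
assets = {
    "185": {date(2010, 1, 4): 98149.35, date(2010, 1, 5): 98518.20},
    "996": {date(2010, 1, 4): 84525.72},  # missing on 05/01
}

all_dates = [d for days in assets.values() for d in days]
a, last = min(all_dates), max(all_dates)

# For each day a, take each code's record with the greatest date <= a,
# or 0 when the code has no record yet (like the If(IsNull(...)) above)
data = []
while a <= last:
    for code, days in assets.items():
        known = [d for d in days if d <= a]
        value = days[max(known)] if known else 0
        data.append((code, a, value))
    a += timedelta(days=1)
```

Code 996 has no row for 05/01, so its 04/01 value is carried forward to that day, which is exactly what the script's per-day joins produce.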



Not applicable
Author

Oh my God!

It will take me some DAYS to understand what you've done!

Thank you so much!

There's a lot of stuff in there that I don't know.

Thank you again!

Not applicable
Author

I wanted to attach a file, but there were some technical problems with the QlikView website.

If you want to get a .qvw file just send me an e-mail.

My address is: pauleinigh@o2.pl

I'm glad to hear that you can learn something from me 🙂

rwunderlich
Partner Ambassador/MVP

The QlikView Cookbook example titled "Fill values in a data range using previous values" is a pretty close match to your problem.

You can get the Cookbook from:
http://robwunderlich.com/Download.html

-Rob