Hi there,
I'm trying to figure out a way to create a loop that runs based on the number of days in a month and stores each day's worth of data from an existing query. Here is my syntax:
q_scan:
SELECT
B.BRANCHID,
SC.SCANID,
SC.VERSION,
SC.DRIVER,
SC.SCANTYPE,
SC.TIME,
SC.LONGITUDE,
SC.LATITUDE,
SC.EXCEPTION,
SC.GOODSID,
SC.LOCATION,
SC.BARCODE,
SC.CUSTOMERID,
SC.CANCELTIME,
SC.PALLET,
SC.ORDERDATE,
SC.PROPERTY,
SC.TIMEZONEID,
SC.SCANCUSTOMERGROUPID
FROM
DECSPRD.ORDERS O,
DECSPRD.BRANCH B,
DECSPRD.CUSTOMER C,
DECSPRD.STOP S,
DECSPRD.GOODS G,
DECSPRD.SCAN SC
WHERE
O.ORDERREADYTIME BETWEEN TO_DATE('$(vCurrentMonthStart)','MM/DD/YYYY')-1 AND TO_DATE('$(vCurrentMonthEnd)','MM/DD/YYYY')+1 AND
TRUNC(TIMEZONED(O.ORDERREADYTIME, B.TIMEZONEID))
BETWEEN TO_DATE('$(vCurrentMonthStart)', 'MM/DD/YYYY') AND
TO_DATE('$(vCurrentMonthEnd)', 'MM/DD/YYYY') AND
O.ORDERID = S.ORDERID AND
S.STOPID = G.DESTINATIONSTOP AND
G.GOODSID = SC.GOODSID AND
C.BRANCH = B.BRANCHID AND
O.CUSTOMER = C.CUSTOMERID;
STORE q_scan INTO [$(vDirQVD)$(vQVDPrefix)DECSPRD.SCAN.QVD] (qvd);
DROP TABLE q_scan;
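One way to drive a per-day loop is to derive the day count from the month-start variable you already have. A minimal sketch, assuming vCurrentMonthStart holds an 'MM/DD/YYYY' string as in the script above; the `...` stands in for your full column list and joins, and the QVD naming is only an example:

```
// Days in the current month, derived from the existing variable
LET vDays = Day(MonthEnd(Date#('$(vCurrentMonthStart)', 'MM/DD/YYYY')));

FOR d = 0 TO $(vDays) - 1
    // Day in Oracle-friendly format, plus a file-safe version for the QVD name
    LET vDay     = Date(Date#('$(vCurrentMonthStart)', 'MM/DD/YYYY') + $(d), 'MM/DD/YYYY');
    LET vDayFile = Date(Date#('$(vCurrentMonthStart)', 'MM/DD/YYYY') + $(d), 'YYYYMMDD');
    TRACE loading $(vDay);

    q_scan_day:
    SQL SELECT ...            // same column list and joins as the query above
    WHERE TRUNC(O.ORDERREADYTIME) = TO_DATE('$(vDay)', 'MM/DD/YYYY');

    STORE q_scan_day INTO [$(vDirQVD)$(vQVDPrefix)DECSPRD.SCAN.$(vDayFile).QVD] (qvd);
    DROP TABLE q_scan_day;
NEXT d;
```

Whether this is faster than one big select depends heavily on the database; test both.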
I think you have to loop. Two options:
- You can extract all the data with a single select on the db and then create a Qlik table (or QVD) for every day
- Or you can run a select for every day of your date range
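The first option can be sketched like this: one select for the whole month, then one resident load and one QVD per distinct day. This is only an illustration; the table and field names (GRP_LOG, DTA_INIZIO) match the example below, and your column names will differ:

```
// Option 1 sketch: one db round-trip, then split into daily QVDs in Qlik
all_data:
SQL SELECT *
FROM "GRP_CLI"."GRP_LOG"
where to_char(DTA_INIZIO, 'YYYYMM') = '$(vYearMonth)';

// Distinct day keys, numeric to avoid date-format ambiguity
days:
LOAD distinct floor(DTA_INIZIO) as day_num
RESIDENT all_data;

FOR i = 0 TO NoOfRows('days') - 1
    LET vDayNum  = peek('day_num', $(i), 'days');
    LET vDayName = date($(vDayNum), 'DD_MM_YYYY'); // file-safe name

    one_day:
    NoConcatenate LOAD * RESIDENT all_data
    WHERE floor(DTA_INIZIO) = $(vDayNum);

    STORE one_day INTO table_$(vDayName).qvd (qvd);
    DROP TABLE one_day;
NEXT i;

DROP TABLE days;
DROP TABLE all_data;
```

This trades n small queries for one big one plus resident loads; it keeps the whole month in RAM during the split, so it only makes sense if one month fits in memory.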
I tried with the second one; this seems to work in my environment (Oracle DD/MM/YYYY, Qlik DD/MM/YYYY), but it depends on the db, date settings, etc.....
Hope it helps
// conn to db
OLEDB CONNECT32 TO [Provider=OraOLEDB.Oracle.1;Persist Security Info=True;User ID=xxxx;Data Source=xxxx;Extended Properties=""] (XPassword is xxxxxxxxx);
// yearmonth to extract
//
let vYearMonth = year(today()) & num(month(today()), '00'); // YYYYMM
trace $(vYearMonth);
//
// distinct date for that year month
//
date:
LOAD distinct date(floor(DTA_INIZIO)) as dta_inizio;
SQL SELECT *
FROM "GRP_CLI"."GRP_LOG"
where to_char(DTA_INIZIO, 'YYYYMM') = '$(vYearMonth)' // same format in oracle and qv variable
;
// loop on distinct date: 1 select for every day
// read and store
//
let vRows=NoOfRows('date');
for i=0 to $(vRows)-1
let vDate=peek('dta_inizio', $(i));
trace $(vDate) ----- $(i);
table:
LOAD *;
SQL SELECT *
FROM "GRP_CLI"."GRP_LOG"
where trunc(DTA_INIZIO) = to_date('$(vDate)', 'DD/MM/YYYY')
;
let vDate2=replace('$(vDate)', '/', '_');
trace $(vDate2);
store table into table_$(vDate2).qvd (qvd);
DROP Table table;
next;
Thank you Massimo. I will try to experiment with this concept. For context, let me explain a little more: this query is a small section of a much bigger script. The issue we are running into is size, because we have to pull a huge amount of historical data. Would it be better to store the historical data from this query into monthly QVDs so that the for loop can help the query pull more efficiently? How would I go about doing that? Thank you for responding to my question.
It depends on so many things that it is difficult to answer:
- qvd: I never used QVDs by date. When I have a QVD >= some GB I prefer to read and store QVDs by month; this way I have all months on the file system, I can re-read from the source only the months that changed (well, it depends....), and I can read the last n years when I reload the user doc
- what is a huge amount? did you try to store one month in a qvd?
- database: your dba should know whether one query or n queries (by month, by date) is better; the load on the db and the elapsed time can change a lot
- this version loops by month:
let vYearMonth = year(today()) & num(month(today()), '00');
trace $(vYearMonth);
// distinct YYYYMM, from db, few records
date:
LOAD DTA_INIZIO as dta_inizio;
SQL SELECT distinct to_char(DAT_INIZIO_JOB, 'YYYYMM') as DTA_INIZIO
FROM "STG_CODE_ETL"
where to_char(DAT_INIZIO_JOB, 'YYYYMM') <= '$(vYearMonth)'
;
let vRows=NoOfRows('date');
for i=0 to 2 //$(vRows)-1 // 2 = test
let vDate=peek('dta_inizio', $(i));
trace $(vDate) ----- $(i);
table:
LOAD *;
SQL SELECT *
FROM "STG_CODE_ETL"
where to_char(DAT_INIZIO_JOB, 'YYYYMM') = '$(vDate)';
store table into table_$(vDate).qvd (qvd);
DROP Table table;
next;
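Once the monthly QVDs exist, reloading the last n years in the user doc is just another loop over month keys. A minimal sketch, assuming the table_YYYYMM.qvd naming from the store loop above (FileSize is used to skip months that have no file):

```
// Read back the last 24 monthly QVDs (2 years); adjust the range as needed
FOR m = 0 TO 23
    LET vYM = date(AddMonths(today(), -$(m)), 'YYYYMM');
    IF FileSize('table_$(vYM).qvd') > 0 THEN
        history:
        LOAD * FROM table_$(vYM).qvd (qvd); // auto-concatenates: same fields
    END IF
NEXT m;
```

Because every monthly QVD has the same field list, the loads auto-concatenate into one history table, and unchanged months never touch the database again.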