kdaniels-obrien
Partner

Create separate .qvd files to eventually concatenate

Hi, 

I am looking to use one QVD file to store 12 hours of data. Then, at the end of those 12 hours, I want to concatenate that data into another QVD file that serves as a historical data file. I'm hoping this will speed up my load time. Can someone help me get started with this process? How do I specify when the 12-hour data should be moved into the historical data? What are some key commands/functions others have used when doing this?

 

Thank you! 


3 Replies
Vegar
Partner

I think you should be able to speed up your load. The important thing when doing this is to keep track of which transactions have been sent to the historical QVD and which have not. Timestamps and transaction IDs are both good indicators of which data has already been loaded into the QVD.

I suggest you start by reading up on a topic called incremental load. You will find a lot of posts on incremental loads in this community. You will find a lot on the topic outside of the Qlik sphere as well, but I recommend you stick to the Qlik-related postings to begin with.
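
For illustration, here is a minimal sketch of the transaction-ID variant of an incremental load; the table names, field names, and lib:// paths below are placeholders, not taken from this thread:

// Load the existing history first so that Exists() knows which IDs are already stored
History:
LOAD TransactionID, Time, Amount
FROM [lib://Data/AllTime.qvd] (qvd);

// Append only the rows whose TransactionID has not been seen before
Concatenate (History)
LOAD TransactionID, Time, Amount
FROM [lib://Data/Last12Hours.qvd] (qvd)
Where Not Exists(TransactionID);

// Write the combined table back so it becomes the new history
STORE History INTO [lib://Data/AllTime.qvd] (qvd);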

 

Good luck.

Plees ekskuse my Swenglish and or Norweglish spelling misstakes
kdaniels-obrien
Partner
Author

Would something like this in my load script achieve what I described in the post?

Earlier in my load script, I get the MaxTime from the last load... I want to collect data all day long and then, when it is a new day, move those records from the last-24-hours QVD to the other QVD file that holds the all-time data.

// vMaxTime holds the max timestamp from the previous load (set earlier in the script)
If Date(Floor($(vMaxTime))) <> Date(Floor(Today())) Then
    [Last 24 Hours]:
    LOAD *
    FROM 'lib://..../24Hours.qvd' (qvd);

    Concatenate([Last 24 Hours])
    LOAD *
    FROM 'lib://.../AllTime.qvd' (qvd);

    // Persist the combined table back to the all-time QVD before dropping it
    STORE [Last 24 Hours] INTO 'lib://.../AllTime.qvd' (qvd);
    DROP TABLE [Last 24 Hours];
End If

Vegar
Partner
Accepted Solution

I think this might be a more robust solution if you are running this script multiple times a day.

MaxTimeTable:
LOAD Max(Time) as MaxTime
FROM [lib://.../AllTime.qvd] (qvd);

LET vMaxTime = Num(Peek('MaxTime', -1, 'MaxTimeTable')); // latest transaction time; Num() keeps the $-expansion numeric

Data:
LOAD *
FROM [lib://.../24Hours.qvd] (qvd)
Where Time > $(vMaxTime); // only rows newer than what is already in the history

Concatenate (Data)
LOAD *
FROM [lib://.../AllTime.qvd] (qvd);

STORE Data INTO [lib://.../AllTime.qvd] (qvd);

DROP TABLE MaxTimeTable; // tidy up the helper table
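
One practical detail the script above does not cover: on the very first reload, AllTime.qvd does not exist yet, so the LOAD from it will fail. A minimal sketch of one way to guard against that, placed before the script above (the lib:// paths are placeholders; FileSize() returns null for a missing file):

// Guard for the very first reload, when AllTime.qvd has not been created yet
If IsNull(FileSize('lib://Data/AllTime.qvd')) Then
    // First run: seed the history directly from the short-interval QVD
    Data:
    LOAD *
    FROM [lib://Data/24Hours.qvd] (qvd);

    STORE Data INTO [lib://Data/AllTime.qvd] (qvd);
    DROP TABLE Data;
End If
// After this guard, the incremental merge above can run unconditionally,
// since AllTime.qvd is now guaranteed to exist.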

 

You could take a look at this YouTube video: https://www.youtube.com/watch?v=GLO2UteiHQ8

Plees ekskuse my Swenglish and or Norweglish spelling misstakes
