Not applicable

How do I automate the conversion of my data into QVD format?

Hi, I'm new to QlikView and was wondering if anyone would kindly share some scripting with me:

I'm trying to automate the conversion of CSV files into QVD format, but I only need to convert the new files that get added to the folder each day, not the files that have already been converted. How do I go about doing this?

13 Replies
ekech_infomotio
Partner - Creator II

Hi,

1. Initially, load all your files.

2. After loading, move the files via

Execute cmd.exe /C move /Y "yourAlreadyLoadedFile" "where\you\want\to\have\yourAlreadyLoadedFile";

or delete them with

Execute cmd.exe /C del /Q "yourAlreadyImportedFile";
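
Put together, a minimal sketch of steps 1 and 2 might look like this (paths are placeholders, and Execute needs "Can Execute External Programs" ticked on the script editor's Settings tab):

Data:
LOAD * FROM [C:\Import\*.csv] (ansi, txt, delimiter is ',', embedded labels);  // assumes all CSVs share the same layout

STORE Data INTO [C:\QVD\History.qvd] (qvd);
DROP TABLE Data;

// archive the CSVs so the next run only sees new files
Execute cmd.exe /C move /Y "C:\Import\*.csv" "C:\Archive";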

that's all

regards,

Edgar

ekech_infomotio
Partner - Creator II

... OK, perhaps it was a bit too short:

Day 1

load your file(s) from your import-folder

move them to another folder

write QVD

Day 2

load the (newly arrived) file(s) from your import-folder

load your previously written QVD

move your file(s)

write QVD (overwrite old one)

Day 3

like Day 2 😉

Everything else depends on the type and naming of your files, your permissions to move or delete them, and your strategy for keeping history QVDs.
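
A rough sketch of Day 2 onward, reusing the placeholder paths from the sketch above; the only addition over Day 1 is concatenating the previously written QVD before overwriting it:

Data:
LOAD * FROM [C:\Import\*.csv] (ansi, txt, delimiter is ',', embedded labels);  // only the new files - older ones were moved away

Concatenate (Data)
LOAD * FROM [C:\QVD\History.qvd] (qvd);      // history written on previous days

STORE Data INTO [C:\QVD\History.qvd] (qvd);  // overwrite the QVD with old + new rows
DROP TABLE Data;

// then move (or delete) the processed CSVs as before
Execute cmd.exe /C move /Y "C:\Import\*.csv" "C:\Archive";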

regards,

Edgar

shane_spencer
Specialist

I need to do something similar (and I'm also new to Qlikview).

I've got many "new/modified" CSV files (from Windows Perfmon) being copied to a central share daily (actually hourly, but I could reduce the frequency).

If I move or delete these files, the scheduled jobs I've got set up (outside of QlikView) to copy over new/modified files will just copy them back again.

Therefore, is there a way to load just the new/modified CSVs into the existing QVD without moving or deleting them?

Not applicable
Author

I found that the answer to this thread helped:

http://community.qlik.com/thread/48633

But it didn't run very quickly with a very large amount of data.

shane_spencer
Specialist

That looks like it would interrogate each file and compare the date in the file with the last modified date. With the number of files I've got, it would take a long time and get progressively slower.

Even if I create a QVD for each month it will still be slow; I guess no quicker than doing a full reload.
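
For what it's worth, one way to avoid both moving the files and interrogating the data inside each one would be to compare each CSV's FileTime() with the QVD's and only load files modified since the QVD was last written. This is just a sketch under assumptions (placeholder paths, all CSVs sharing the same columns, files no longer being appended to once picked up), not the script from the linked thread:

Let vQvd = 'C:\QVD\Perfmon.qvd';   // placeholder

For each vFile in FileList('\\central\share\perfmon\*.csv')
    // load only files modified after the QVD was last written
    // (Alt() makes the first run, with no QVD yet, load everything)
    If FileTime('$(vFile)') > Alt(FileTime('$(vQvd)'), 0) then
        Data:
        LOAD '$(vFile)' as SourceFile, *
        FROM [$(vFile)] (ansi, txt, delimiter is ',', embedded labels);
    End If
Next vFile

// append what the QVD already holds, then write it back out
// (a real script would also guard against no new files being found)
If not IsNull(FileTime('$(vQvd)')) then
    Concatenate (Data)
    LOAD * FROM [$(vQvd)] (qvd);
End If

STORE Data INTO [$(vQvd)] (qvd);
DROP TABLE Data;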

Not applicable
Author

Yes, that is exactly what it does.

But creating a QVD and loading from the QVD instead of the original files reloads a LOT faster!

In the end, I simply create QVD files at the end of each month manually, reload those, and then just load the latest data normally. I find that works a lot quicker, but unfortunately there's nothing automated about it.

There is a partial reload function in QlikView that might be of use to you, but I don't know how to use it. You could also schedule a reload for some time every evening while you're not at work; that way it doesn't matter how long it takes.

Sorry I couldn't be more help. Please let me know if you figure it out!

shane_spencer
Specialist

Well, as my perfmon logs stop and start at midnight, it would make sense to reload shortly after that, so I guess a long load time isn't a big problem. And to make things easier, I could make sure the logs are copied over to the central location only once, at the end of the day before a new log is created, so I'm not loading partial files.

All the files have a date stamp in the name, e.g. servername_yyyymmdd.csv.

And I'm contemplating creating a new QVD for each month, so I'd have a maximum of 31 days of CSV files to load.

(I've got a separate QlikView document that loads all the QVDs into a nice front end.)

I've already got the code to load only certain files based on the name; this loads all of April's:

SET search='*_201204*';

for each dir in DirList('\\hbeu.adroot.hsbc\dfsroot\gb002\CBI_SUPP_ENV\GFA\*')

for each csvfile in FileList('$(dir)\$(search).csv')

Perhaps I could replace the fixed date (i.e. 201204 for April) with something based on the current date (actually the previous day's date). That way I could do my initial load to build up my historic data manually, then just add new CSV files to the current month's QVD.

Not sure how I'd code that though.

shane_spencer
Specialist

Because my files have a date stamp in the name, I was able to pick out the previous day's files:

Let yesterdate = Date(Today()-1, 'YYYYMMDD');
Let yestermonth = Date(Today()-1, 'YYYYMM');
SET search='*_$(yesterdate)*';

data:
LOAD * FROM Part_processed_C_$(yestermonth).qvd (qvd);   // the history already stored for this month

for each dir in DirList('\\hbeu.adroot.hsbc\dfsroot\gb002\CBI_SUPP_ENV\GFA\ePlanning\*')
    for each csvfile in FileList('$(dir)\$(search).csv')   // only yesterday's files match the mask
        CROSSTABLE(Metric, value, 2)
        LOAD date, '$(csvfile)' as file, *
        FROM $(csvfile) (ansi, txt, delimiter is ',', embedded labels, msq, filters(Top(1, 'date')));
    next csvfile
next dir

store data into Part_processed_C_$(yestermonth).qvd;
drop table data;

N.B. I could have used the "yesterdate" variable directly instead of the "search" mask, but I've kept "search" in case I need to do some manual loads.
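
One gap worth noting: the first time a new month's QVD is written (and on the very first run), Part_processed_C_$(yestermonth).qvd doesn't exist yet, so the initial LOAD will fail. A hypothetical guard using FileSize(), which returns NULL for a missing file, plus an empty seed table so the CROSSTABLE loads still concatenate into "data", could replace that first LOAD:

data:
LOAD Null() as date, Null() as file, Null() as Metric, Null() as value
AutoGenerate 0;   // empty table with the same fields the CROSSTABLE loads produce

If not IsNull(FileSize('Part_processed_C_$(yestermonth).qvd')) then
    Concatenate (data)
    LOAD * FROM Part_processed_C_$(yestermonth).qvd (qvd);   // append the month's history when it exists
End If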

Not applicable
Author

Nice one, thanks!