Hi all,
I have a scenario where the transactions are exported to a text file on a daily basis due to the volume of transaction lines.
I am trying to create a for-each loop that finds all the text files and loads them into the transaction QVD.
The file name format is tran_dd_mm_yyyy.txt. So today's file will be tran_06_09_2011.txt
I was thinking of using FileBaseName() to determine the file name to load, but my file name is dynamic. Is there anybody who can help?
Regards
Jimmy
Hi.
You should save the file name in a variable, for example:
LET sNom = '01.01.2012';
and then do something like the following:
STORE Data INTO [$(sNom).qvd] (qvd);
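As a rough sketch (the paths, delimiter, and table names here are assumptions, adjust them to your data), you could build the name from today's date, or loop over all matching files with FOR EACH .. IN FILELIST:

```
// Hypothetical sketch: build today's file name from the date
// and load it into one table.
LET vToday = Date(Today(), 'DD_MM_YYYY');   // e.g. 06_09_2011

Transactions:
LOAD * FROM [tran_$(vToday).txt]
(txt, codepage is 1252, embedded labels, delimiter is '\t');

// Or append every daily file that matches the pattern:
// FOR EACH vFile IN FileList('C:\Data\tran_*.txt')
//     Transactions:
//     LOAD * FROM [$(vFile)]
//     (txt, codepage is 1252, embedded labels, delimiter is '\t');
// NEXT vFile

STORE Transactions INTO [Transactions.qvd] (qvd);
```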
Hope this helps!
Regards.
Hi,
I am doing the following in order to load and store dynamically:
SET FilePath='*.csv';
set vFileName = 'file';
for each File in filelist (C:\Users\innas\Desktop\Qvd\QV_new\daily\ &'\'&'$(FilePath)')
SET sFile = '$(File)';
Directory;
NetworkDomainDailyData:
LOAD *
FROM
[$(sFile)]
(fix, codepage is 1252);
store * from NetworkDomainDailyData into C:\Users\innas\Desktop\Qvd\Sql_Data\$(vFileName).qvd(qvd);
drop table NetworkDomainDailyData;
next File
exit script;
and it does not load/store anything.
Where is my mistake?
thanks a lot.
Hi.
Try the attached example.
Regards.
Hi Sandro,
it still does not load anything.
What else can be the reason?
Hi.
Please, upload your QlikView file for review.
Regards.
Hi,
here it is.
Thanks a lot.
Hi.
Sorry, could you upload the CSV file you are trying to read?
Or an example with dummy data.
Regards.
Hi ,
here is the file csv.
Hello again.
See the attachment.
You must change the directory in this call to your own path:
call DoDir ('C:\Users\spividori\Desktop\QView\forum\LeerArchivos');
and it should work.
The error was this line:
(ooxml, embedded labels, table is Hoja1);
The format specifier on that line is specific to each file type. You were trying to read a CSV file with a statement meant for Excel files.
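For reference, a sketch of how the format specifier differs by file type (the file names and sheet name here are just placeholders):

```
// CSV (text) file: use the txt specifier with a delimiter
Data:
LOAD * FROM [daily.csv]
(txt, codepage is 1252, embedded labels, delimiter is ',');

// Excel .xlsx file: the ooxml specifier applies instead
// Data:
// LOAD * FROM [daily.xlsx]
// (ooxml, embedded labels, table is Hoja1);
```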
I hope you understand me, my English is poor.
Regards.
Hi,
I managed to get it working.
Thanks a lot!!!!