Hi everybody,
Does anyone know how I can combine different .csv files into the same QVD?
[Extraccion]:
Load
[ID Orden],
[Telefono Acceso], ...
FileName() as "Fichero"
FROM [lib://Extraccion/??????_Extraccion_completa_provision.csv]
(txt, codepage is 1252, embedded labels, delimiter is ';', no quotes);
Store [Extraccion] into [lib://Provision_Escritorio/Extraccion_completa_provision.qvd](qvd);
Drop table [Extraccion];
Note: [lib://Extraccion/??????_Extraccion_completa_provision.csv] --> Every day I download a file named with the current date, and I keep it in a folder where I delete the previous day's file.
Thank you
Hi,
Today, create a QVD from your file using:
Extraccion:
LOAD * ,
Filename() as Fichero
FROM [lib://Extraccion/*_Extraccion_completa_provision.csv]
(txt, codepage is 1252, embedded labels, delimiter is ';', no quotes);
Store Extraccion into [lib://Provision_Escritorio/Extraccion_completa_provision.qvd](qvd);
Drop table [Extraccion];
Then, you can concatenate new data from the csv to your QVD and create a new one:
// Load the QVD
Extraccion:
LOAD *
FROM [lib://Provision_Escritorio/Extraccion_completa_provision.qvd](qvd);
// concatenate new data
concatenate(Extraccion)
LOAD * ,
Filename() as Fichero
FROM [lib://Extraccion/*_Extraccion_completa_provision.csv]
(txt, codepage is 1252, embedded labels, delimiter is ';', no quotes);
// replace the old qvd with the new one in the folder
Store Extraccion into [lib://Provision_Escritorio/Extraccion_completa_provision.qvd](qvd);
Drop table [Extraccion];
Hi Karen,
Does your latest file contain new records only, or does it include previous records as well?
Hi Gian,
Every file contains only new information.
Thanks
Ok, so my script should work well for your need.
(I've edited the final Store.)
Christophe's script should be fine, but you can also use the version below and just eliminate the creation of another QVD.
Extraccion:
LOAD * ,
Filename() as Fichero
FROM [lib://Extraccion/*_Extraccion_completa_provision.csv]
(txt, codepage is 1252, embedded labels, delimiter is ';', no quotes);
Store Extraccion into [lib://Provision_Escritorio/Extraccion_completa_provision.qvd](qvd);
Drop table [Extraccion];
Then, you can concatenate new data from the csv to your QVD and create a new one:
// Load the QVD
Extraccion:
LOAD *
FROM [lib://Provision_Escritorio/Extraccion_completa_provision.qvd](qvd);
// concatenate new data
concatenate(Extraccion)
LOAD * ,
Filename() as Fichero
FROM [lib://Extraccion/*_Extraccion_completa_provision.csv]
(txt, codepage is 1252, embedded labels, delimiter is ';', no quotes);
// replace the old qvd with the new one in the folder
Store Extraccion into [lib://Provision_Escritorio/Extraccion_completa_provision.qvd](qvd);
Drop table [Extraccion];
I'm not sure if I did it well, but the previous information is deleted. I can check it because of the column "Fichero", which contains "180606_Extraccion.csv".
I've tried with files from JUN05 and JUN06.
Thanks
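A quick way to confirm which daily files actually made it into the stored QVD is to list the distinct values of the Fichero field. This is a minimal diagnostic sketch; the library path and field name are taken from the scripts above:

```qlik
// Diagnostic load: list each source file present in the QVD.
// If only one file name appears, the concatenation step overwrote the history.
[Check_Ficheros]:
LOAD Distinct Fichero
FROM [lib://Provision_Escritorio/Extraccion_completa_provision.qvd] (qvd);
```

If file names for both JUN05 and JUN06 appear, the QVD kept both days' data.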
First, you need to do an initial load like this:
[Extraccion]:
Load
[ID Orden],
[Telefono Acceso], ...
FileName() as "Fichero"
FROM [lib://Extraccion/??????_Extraccion_completa_provision.csv] //<== For example, your previous data
(txt, codepage is 1252, embedded labels, delimiter is ';', no quotes);
Store [Extraccion] into [lib://Provision_Escritorio/Extraccion_completa_provision.qvd](qvd);
Drop table [Extraccion];
After that, do an incremental load like the one below.
[Incremental]:
Load
[ID Orden],
[Telefono Acceso], ...
FileName() as "Fichero"
FROM [lib://Extraccion/??????_Extraccion_completa_provision.csv] //<== Your latest data
(txt, codepage is 1252, embedded labels, delimiter is ';', no quotes);
Concatenate
// Append the history from the existing QVD, keeping only rows whose
// [ID Orden] was not already loaded from the latest file above
Load *
From [lib://Provision_Escritorio/Extraccion_completa_provision.qvd] (qvd)
Where Not Exists([ID Orden]);
Store [Incremental] into [lib://Provision_Escritorio/Extraccion_completa_provision.qvd] (qvd);
Drop table [Incremental];
Let us know if this helps.
Thanks,
Please mark the thread as answered if this solved your issue.
Thanks.