Hello guys,
I have a lot of automatic data exports in Excel.
My problem is that not all the Excel files have the same columns, and I don't know how to load them all in a single load script.
I've attached an example with three files with the following names:
- Export - Fri Jul 24 2020 18_58_13 GMT+0200 (hora de verano de Europa central)
- Export - Sat Jan 09 2021 17_14_40 GMT+0100 (hora estándar de Europa central)
- Export - Thu Nov 05 2020 17_21_51 GMT+0100 (hora estándar de Europa central)
All three files come with three common columns, A, B and C, and the 2021 file always comes with an additional column D.
Currently I'm using this statement to load all the files with the common columns A, B and C:
Bulk_Export_Example :
LOAD
A,B,C
FROM [lib://Bulk Export Example/*.xlsx]
(ooxml, embedded labels, table is Hoja1);
How can I load all the files automatically? (In my real case I have more than 100 Excel files; at the beginning of 2021 my customer added a new column, and this could change again in the future.)
You can find the files above.
Best regards, Marcel.
Hi @marcel_olmo
Try like below:
Table:
// Dummy table with a single field E; the wildcard load below
// is then forced to concatenate every file into this one table
Load * Inline
[
E
];
Concatenate
LOAD *
FROM
[D:\Qlik\personnal\Export*.xlsx]
(ooxml, embedded labels, table is Hoja1);
// The dummy field is no longer needed once the files are loaded
DROP Field E;
This assumes all your Excel files have the same sheet name.
Also check whether this is similar to your case: Link
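For reference, here is the same dummy-field trick rewritten against the lib:// connection from the question (a sketch, not tested against the attached files; the connection name, sheet name Hoja1, and dummy field E come from the thread, while the explicit table name in the Concatenate prefix is just for clarity):

```
// Start the target table with a single dummy field
Bulk_Export_Example:
Load * Inline
[
E
];

// Forced concatenation: every file matched by the wildcard is
// appended to Bulk_Export_Example, whatever columns it has.
// Missing columns (e.g. D in the 2020 files) simply come back null.
Concatenate (Bulk_Export_Example)
LOAD *
FROM [lib://Bulk Export Example/*.xlsx]
(ooxml, embedded labels, table is Hoja1);

// The dummy field only existed to force the concatenation
DROP Field E;
```

Without the forced Concatenate, Qlik auto-concatenates only files whose column sets match exactly, which is why the plain wildcard load splits into several tables.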
Thanks @Anil_Babu_Samineni for your quick response.
The problem with LOAD * is that it creates as many copies of the table as there are different column sets, so I end up with Table, Table-1, Table-2, etc.
If I understand correctly, you can call out the "ALL" keyword:
LOAD
A,B,C
FROM [lib://Bulk Export Example/*.xlsx]
(ooxml, embedded labels, table is ALL);
Thanks for the try @Anil_Babu_Samineni in my case it didn't work 😞
Thanks @MayilVahanan, this workaround worked fine.
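For future readers: an equivalent pattern loops over the matched files explicitly with FileList and forces each one into the same table. This is a sketch under the same assumptions as above (connection name, sheet name Hoja1); note that FileList with a lib:// path depends on your Qlik Sense security/legacy settings:

```
// Dummy table so the per-file loads can be force-concatenated
Bulk_Export_Example:
Load * Inline
[
_dummy
];

// Loop over every export file and append it to the same table
FOR EACH vFile in FileList('lib://Bulk Export Example/*.xlsx')
  Concatenate (Bulk_Export_Example)
  LOAD *
  FROM [$(vFile)]
  (ooxml, embedded labels, table is Hoja1);
NEXT vFile

// Remove the dummy field used to force concatenation
DROP Field _dummy;
```

The loop form behaves like the wildcard version but also gives you the file name in $(vFile), which is handy if you later want to load it into a column for auditing.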
Best regards, Marcel.