Hi,
I'm looking to load several years' worth of sales data from CSV files into a single table. The only problem is that new fields have been added over the years, so the files containing the data vary a little. When I tried "LOAD * FROM Sales*.csv", QlikView crashed, so I'm assuming that has to do with the extra/missing fields.
An example of the first table (1995) looks like this:
SaleID | Date | ShopID | ItemID |
---|---|---|---|
001 | 01/01/1998 | S001 | I001 |
Whereas a table from 10 years ago looks like this:
SaleID | Date | ShopID | ShopZone | ItemType | ItemID |
---|---|---|---|---|---|
001 | 01/01/1998 | S001 | Z01 | Food | I001 |
And the latest table looks like this:
SaleID | Date | ShopID | ShopZone | ShopCountry | ItemType | ItemID | ItemCount | Total |
---|---|---|---|---|---|---|---|---|
001 | 01/01/1998 | S001 | Z01 | UK | Food | I001 | 2 | 2 |
These are greatly simplified: the oldest files have 6 columns and the latest have 59. Columns have been added over the years to gather more information, but none have been deleted, so the basic information (IDs, etc.) is present in all files.
Try reading the files in a for loop (you can find a lot of posts about this) and concatenating them; that way you can discover which one crashes.
Here is an example where 2.xlsx, 3 and 4 have different lists of fields; to concatenate them all, I use a dummy table with a fake field.
Hope it helps
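For reference, a minimal sketch of the loop-plus-dummy-table idea in QlikView script (the file mask, field name `DummyField`, and the CSV format specifier are assumptions; adjust them to your files):

```
// Start from a dummy table with a single fake field, so the first real
// file concatenates onto it no matter which columns it contains.
Sales:
LOAD * INLINE [
    DummyField
];

// Loop over every matching file and force-concatenate it; missing
// columns simply come through as null in the combined table.
FOR EACH vFile in FileList('Sales*.csv')
    CONCATENATE (Sales)
    LOAD *
    FROM [$(vFile)]
    (txt, codepage is 1252, embedded labels, delimiter is ',', msq);

    // The log shows the last file reached, so a crashing file is easy to spot.
    TRACE Loaded $(vFile);
NEXT vFile

// Remove the fake field once all files are in.
DROP FIELD DummyField FROM Sales;
```

Because each file is loaded in its own LOAD statement, QlikView never has to reconcile differing column lists across a single wildcard load, which is what usually trips up "LOAD * FROM Sales*.csv" when the layouts differ.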
Thank you Alessandro, that worked a treat!