Data: LOAD * INLINE [File]; // dummy table to enable the concatenation
FOR EACH sFile IN FILELIST('MyFileList_w_*')
    CONCATENATE (Data) LOAD
        ..... // here comes the script and the transformation of the data
        FILENAME() AS File // just keeps a reference to the source file
    FROM [$(sFile)] (XmlSimple, Table is [.....]);
NEXT sFile
STORE Data INTO Data.qvd (qvd);
If at all possible, I suggest avoiding such an approach and loading these data from a database instead, because loading about 180 M records from "complex" text files will take some time: XML is nearly the slowest possible file format (a "normal" CSV would load a lot faster), especially when the data are spread across so many files. If each load statement carries just 0.5 seconds of overhead, 18 K files amount to 18,000 × 0.5 s = 9,000 s, i.e. 2.5 hours, before any loading, transforming and storing of the data.
Further, I think implementing an incremental load approach would be quite useful. Here you will find various links on this topic: Advanced topics for creating a qlik datamodel.
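A minimal sketch of such an incremental load, assuming the stored File field can be used to detect which files were already processed (the QVD name, file mask and XML table name are illustrative assumptions, not from the original post):

// Reload the previously stored data if the QVD already exists
IF NOT IsNull(FileSize('Data.qvd')) THEN
    Data: LOAD * FROM Data.qvd (qvd);
ELSE
    Data: LOAD * INLINE [File]; // dummy table to enable the concatenation
END IF

// Append only those files which are not yet part of the QVD
FOR EACH sFile IN FILELIST('MyFileList_w_*')
    IF IsNull(Lookup('File', 'File', '$(sFile)', 'Data')) THEN
        CONCATENATE (Data) LOAD
            ..... // transformations as before
            FilePath() AS File // full path, so it matches $(sFile)
        FROM [$(sFile)] (XmlSimple, Table is [.....]);
    END IF
NEXT sFile

STORE Data INTO Data.qvd (qvd);

This way only new files cost the per-file overhead on each run; the bulk of the 18 K files is read once from the QVD, which is by far the fastest load path in QlikView.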