To this I need to add some dummy fields so that, when it is stored later as a QVD, it will have the same fields as the other QVDs. Then, in the dashboard QVWs, these QVDs with identical field metadata can simply be concatenated together, all with optimised loads.
In other QVD Generators I have done that with a MustInclude statement calling a script file that does AutoGenerate(0), but those are simply extracting from the databases and do not have the joins above.
Null() as [STWT Fact Count],
Null() as [STWT Employee Number],
Date(Null()) as [STWT Date],
Null() as [STWT Environment],
Null() as [STWT Business Unit],
// There are a couple of hundred more fields in here
Null() as [Shift Year]
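For context, here is a rough sketch of how I imagine the communal include would hang together (the file path and the table label DummyFields are just placeholders, not my actual names):

```
// DummyFields.qvs - the single communal script file.
// Builds a zero-row table carrying every dummy field, so that
// adding a field here updates all QVD Generators at once.
DummyFields:
LOAD
    Null() as [STWT Fact Count],
    // ...the full dummy field list as above...
    Null() as [Shift Year]
AutoGenerate(0);
```

Each generator would then pull it in with something like `$(Must_Include=..\Scripts\DummyFields.qvs);` before or after building its fact table.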
If I place this at the beginning and concatenate the first table onto it, it takes ages and ages to run.
If I place it at the end with an outer join, it improves a lot, but for QVDs with a few million rows it still takes a few minutes after the AutoGenerate(0). I guess it is reorganising the resulting table during this time.
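To be clear, the join-at-the-end variant I mean looks roughly like this (FactTable is a placeholder for the real table name). Since the generated table has zero rows and no fields in common with the fact table, the outer join just bolts null-filled columns onto every existing row:

```
// After the fact table is fully built, add the dummy fields.
Outer Join (FactTable)
LOAD
    Null() as [STWT Fact Count],
    // ...remaining dummy fields...
    Null() as [Shift Year]
AutoGenerate(0);
```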
I wish to use a communal script, called via a MustInclude statement, that does the AutoGenerate(0), so that when extra fields are needed, changing this single script will sort out all the QVD Generators.
The whole thing is encased in a loop and run multiple times against multiple source databases, which serve different geographical areas, so adding a few minutes of run time per loop adds up to a significant addition to the overall run time.
Any suggestion as to an efficient way to do this?
I may well end up having to do as you suggest, which will work, but it does not allow me to have a single communal QVS script file to AutoGenerate(0) the dummy fields and then re-use that script across multiple QVD Generators.
You wouldn't happen to know what criteria are needed to ensure an optimised concatenated load?