QlikView documentation and resources.
It's a tedious task to generate QVDs while developing a QlikView application.
Here's a QVD generator to make life easier for QlikView developers.
Just edit the script: update the OLE DB / ODBC connection and the schema name.
Change the path where the QVDs need to be saved, then exit the script.
Then just click the "Generate QVDs" text in white.
The QVDs will be generated at the specified path.
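The core of a generator like this can be sketched in QlikView load script. The following is a minimal, hypothetical sketch, not the actual script shipped with the tool; the DSN name, schema, and target path are placeholders you would replace with your own:

```qlikview
// Hypothetical sketch of a QVD generator loop -- placeholders, not the shipped script
ODBC CONNECT TO [MyDSN];              // adjust the OLE DB / ODBC connection here

LET vSchema  = 'dbo';                 // schema to extract from
LET vQvdPath = 'D:\QVDs\';            // where the QVDs should be saved

// Pull the list of tables in the data source, then store each one as a QVD
Tables:
SQLTABLES;

FOR i = 0 TO NoOfRows('Tables') - 1
    LET vTableName = Peek('TABLE_NAME', $(i), 'Tables');

    [$(vTableName)]:
    SELECT * FROM $(vSchema).$(vTableName);

    STORE [$(vTableName)] INTO [$(vQvdPath)$(vTableName).qvd] (qvd);
    DROP TABLE [$(vTableName)];
NEXT i
```

Each pass through the loop pulls one table and writes it straight to a QVD, so nothing accumulates in memory between tables.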
Hope this eases someone's efforts!
Thanks & Regards
Great job on this. The only thing people should be aware of is the size of the tables. There is no restriction on the data pulled, so you could end up with some very long-running loads and very large QVDs.
In a data warehouse environment where all tables share a common column such as "LastModDate", this could easily be added to your script.
If not, it would take a bit of work, but you could load an include file listing the tables whose size you are concerned about, along with the "LimitTableName" and "LimitDateColumn" you wish to restrict on.
The tricky part would be the loop (or possibly an Exists()) to tie in the first part of the IF statement and then pull that LimitTableName's LimitDateColumn.
Below is all theoretical but gives something to think about.
Let vDataStartDate = YearStart(Today(), -1);

// Pull the limit settings for the table currently being loaded
// (assumes the include file loaded a LimitTables table with the
// fields LimitTableName and LimitDateColumn)
Let vLimitTable = Lookup('LimitTableName', 'LimitTableName', '$(vTableName)', 'LimitTables');
Let vLimitDate  = Lookup('LimitDateColumn', 'LimitTableName', '$(vTableName)', 'LimitTables');

/***********[ Date Limit ]************/
IF '$(vTableName)' = '$(vLimitTable)' THEN
    LET vWhereClause = 'WHERE $(vLimitDate) >= ' & Chr(39) & '$(vDataStartDate)' & Chr(39);
ELSE
/***********[ Complete Table ]************/
    LET vWhereClause = 'WHERE 1=1';
ENDIF

SELECT * FROM $(vSchema).$(vTableName) $(vWhereClause);
Something to think about anyway.
Thanks for the hard work.
Very helpful, thanks.
@Gowtham Kesavan Glad to hear that
We have been using something similar to this for a while: multistage ETL.
Multistage ETL, is that a tool?
Just a concept. Rather than extracting data directly into our QVW files, we move everything to QVD files first. In our world, most of what we import comes from SQL, but we also occasionally import from text files, Excel files, and even other QVDs. Often we perform the T (transformation) at the SQL level, but sometimes it is quicker to change the format of a date or the name of a field in the second layer of the load.

We also have very large QVD files, so we created a reducer layer in the load process so that we can create a test or working environment. If I loaded all of the data into my primary QVW, the file would be 14 GB, which is very hard to work with when testing. The reducer logic brings the file down to a manageable 1.5 GB. Our main app loads from 30 QVD files. The reducer is just another QVW that batch-loads data from all of my QVD files, uses a WHERE statement to filter by a dimension such as a date or a minimum quantity, and then writes back to the QVD with the reduced dataset.
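A reducer like the one described above can be sketched in a few lines of QlikView load script. This is a hypothetical sketch, assuming the QVD path and a common date field named OrderDate; neither comes from the original post, and a real reducer would need per-file filter logic:

```qlikview
// Hypothetical reducer sketch: reload each QVD, filter it, write it back
SET vQvdPath = 'D:\QVDs\';                    // assumed QVD folder
LET vCutoff  = Num(YearStart(Today(), -1));   // keep roughly the last two years

FOR EACH vFile IN FileList('$(vQvdPath)*.qvd')
    Tmp:
    LOAD *
    FROM [$(vFile)] (qvd)
    WHERE OrderDate >= $(vCutoff);            // assumed common date field

    STORE Tmp INTO [$(vFile)] (qvd);          // overwrite with the reduced dataset
    DROP TABLE Tmp;
NEXT vFile
```

Overwriting the source QVDs in place keeps the main app's load script unchanged; you would typically run this only against a copied test folder, not production extracts.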
@Mike Czerwonky Yep, the reducer concept sounds good!
I work in R&D where we deal with dummy data, so I haven't had exposure to humongous data yet.
But I will keep your advice as a backup for when I encounter such situations.
Thanks for sharing this info!
Thank you very much.
Welcome @Hirish V
Glad that you found it helpful
Don't use Qlik for automated QVD generation.
It's a heavy process, it runs on the Windows platform only, and it doesn't allow threading (within loop cycles).