Hello,
I have a requirement to write 100+ .qvd files to MSSQL Server tables. The qvd files are composed of data from various file types that are occasionally updated or newly loaded. After the load/transform, the data will need to be inserted into database tables nightly or on a schedule.
I know this can be done with scripting in the qvd load script, or by exporting the qvds to csv (STORE & SQL INSERT) and importing them into tables manually or with SSIS. My question is really whether there is a better method. I would prefer not to create the tables one by one and write all of the inserts by hand.
Any suggestions would be appreciated.
There are ETL tools that can read the .qvd file format. You can use one of those tools to load the data from the qvd files and store it in a database. Usually you can let the ETL tool create the target tables in the target database too.
I had found ETL Processor but figured it would be faster to write the scripts than to learn a new tool. I was hoping there might be another option. Either way this looks to be a tedious process.
Thank you Gysbert.
Well, it's a daft requirement imho, so you'll probably end up with a daft solution too. Why were the qvd files created anyway? Why wasn't that data extracted and transformed with an ETL tool and stored in a database in the first place?
The data was from years of disparate Excel, CSV and text documents. QlikView was used to process those files and forms and the resulting data was stored in qvds. The rest is outside my circle of influence.
Yeah, not much you can do about it then. You can easily export the qvd files to csv files. Some databases (I know Oracle does) support csv files as external tables. But you'd still need to define them. Perhaps a friendly local database administrator could help you out in some way.
I can do all of the loading into the DB with SQL Server Integration Services (SSIS), but the qvds would still need to be exported to csv, as SSIS cannot read the QVD format (as far as I know). Having to create 100+ DB tables and write all of the STORE or SQL INSERT statements in QV is what I was hoping to avoid. I have 100+ qvd files to process, and then I may need to combine/normalize the data in the DB. It looks like ETL-Tools is the best solution, as one of the posts I found states the tool can now build the CREATE TABLE statement, and once the ETL package is built the load can be scheduled.
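For what it's worth, once the qvds are exported to csv, the part you're dreading (writing 100+ CREATE TABLE and INSERT statements) can be generated from the csv headers rather than typed by hand. Here is a minimal Python sketch of that idea. The folder layout, table naming, and the NVARCHAR(MAX) column type are all assumptions: the csv export carries no type information, so every column is created as text and would need proper types assigned later.

```python
import csv
import glob
import os

def create_table_sql(table_name, columns):
    """Build a simple CREATE TABLE statement. Every column is NVARCHAR(MAX)
    as a placeholder, since a csv header gives us names but no types."""
    cols = ",\n    ".join(f"[{c}] NVARCHAR(MAX)" for c in columns)
    return f"CREATE TABLE [{table_name}] (\n    {cols}\n);"

def insert_sql(table_name, columns):
    """Build a parameterized INSERT suitable for use with pyodbc executemany."""
    col_list = ", ".join(f"[{c}]" for c in columns)
    params = ", ".join("?" for _ in columns)
    return f"INSERT INTO [{table_name}] ({col_list}) VALUES ({params});"

def scripts_for_folder(folder):
    """Yield (table, create_sql, insert_sql) for every csv in the folder,
    naming each table after the file it was exported from."""
    for path in sorted(glob.glob(os.path.join(folder, "*.csv"))):
        table = os.path.splitext(os.path.basename(path))[0]
        with open(path, newline="", encoding="utf-8") as f:
            columns = next(csv.reader(f))  # first row is the header
        yield table, create_table_sql(table, columns), insert_sql(table, columns)
```

The generated INSERT statements could then be fed to pyodbc's executemany with the csv rows, or skipped entirely in favour of BULK INSERT / SSIS once the tables exist. Either way, the 100+ table definitions come from the files themselves instead of being written one by one.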