Not applicable

Dealing with huge database


Hi,

I am currently working with QlikView Personal Edition and a huge amount of data. I am trying to fetch 4 columns out of 11 from a table containing 70 million records and store them into a .qvd file. The first reload is taking far too long: after 2 hours it had loaded only 5 million records. Is the reload so slow because I am using Personal Edition, or have I missed something? Are there any best practices I should follow? Please suggest some ways to handle this.

Thanks in advance.


3 Replies
Not applicable
Author

Hi,

You may not have enough memory to load all the records. Have you already monitored memory usage during the reload?

Rebeca

Clever_Anjos
Employee

Instead of retrieving the whole table at once, it would be better to choose a date field from your table and split your QVDs into monthly or even daily files.
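A minimal sketch of that approach in load script, assuming a date field called OrderDate; the table and column names are placeholders for your own schema, and the MONTH()/YEAR() functions depend on your database's SQL dialect:

```
// Loop over the months of one year and store each slice
// into its own QVD, dropping it before the next pass so
// memory is freed between slices.
FOR vMonth = 1 TO 12

    MonthSlice:
    SQL SELECT Col1, Col2, Col3, OrderDate
    FROM BigTable
    WHERE MONTH(OrderDate) = $(vMonth)
      AND YEAR(OrderDate) = 2014;

    STORE MonthSlice INTO [Data_2014_$(vMonth).qvd] (qvd);
    DROP TABLE MonthSlice;

NEXT vMonth
```

Each reload then only moves one month of data, and later loads can read the finished QVDs instead of hitting the database again.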

These threads discuss the same issue:

http://community.qlik.com/message/400808#400808

http://community.qlik.com/message/108338#108338

Not applicable
Author

Hi Sakir,

This has nothing to do with your Personal Edition.

The speed depends on your connection to the source (internet speed, how far you are from the database) and on the driver (ODBC, OLE DB). I can see in your load script that you are using an ODBC driver. If possible, try an OLE DB driver instead. Check this with your database vendor; they may provide an OLE DB driver that is much faster!
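For illustration, switching the driver is a one-line change in the script's connect statement; the DSN and provider strings below are placeholders, and your vendor's documentation gives the exact provider name:

```
// ODBC connection (what the original script appears to use):
// ODBC CONNECT TO [MyDSN];

// OLE DB connection; the provider string is illustrative only:
OLEDB CONNECT TO [Provider=SQLOLEDB;Data Source=myServer;Initial Catalog=myDb];
```

Everything after the connect statement (the SQL SELECT and STORE) can stay the same.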

Please also consider the suggestion from Clever Anjos. If you can't get better speed, you have no option but to wait for the reload to complete, or to reduce/split the data into several QVDs. In that case, if you create multiple QVDs from one load script, drop each table after storing it to QVD and before loading the next table. This frees up memory before you start extracting the next set of data.

To drop a table, use the following statement:

drop table TableName;
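Put together, the store-then-drop pattern looks roughly like this; the table, column, and file names are placeholders:

```
// Extract, store, and immediately drop each table so memory
// is freed before the next extraction starts.
Table1:
SQL SELECT Col1, Col2 FROM SourceTable1;
STORE Table1 INTO [Table1.qvd] (qvd);
DROP TABLE Table1;

Table2:
SQL SELECT ColA, ColB FROM SourceTable2;
STORE Table2 INTO [Table2.qvd] (qvd);
DROP TABLE Table2;
```

This way only one table is held in memory at a time, instead of all of them accumulating until the end of the script.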