anushree1
Specialist II

Huge Data Load

Hi,

I am trying to load a table of 206 columns and 16 million records from SAS through QlikView Desktop.

The first 3 million records load within about 3 minutes, but the next 1 million does not load even after 2 hours, and QlikView shuts down. Apart from RAM and disk space, could anything else be causing this trouble?

14 Replies
avinashelite

Are all 206 columns from the same table? If yes, then it is taking a long time to load the records and QV is freezing. The issue is with the RAM.

Try loading the data year-wise, or by applying a filter, and create QVDs for each part, so that you can load the data more easily.
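A year-wise split into QVDs could look something like the sketch below. The table and field names (SourceTable, OrderYear) and the year list are placeholders, not from this thread; adjust them to the actual SAS source.

```qlikview
// Sketch only: SourceTable and OrderYear are hypothetical names for the
// actual SAS source table and a year column in it.
FOR EACH vYear IN 2016, 2017, 2018
    TmpData:
    SQL SELECT * FROM SourceTable WHERE OrderYear = $(vYear);

    // Write this year's slice to its own QVD, then free the RAM.
    STORE TmpData INTO [Data_$(vYear).qvd] (qvd);
    DROP TABLE TmpData;
NEXT vYear
```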

anushree1
Specialist II
Author

Yes, all 206 columns are from the same table.

Loading them into different QVDs would not serve the purpose, as I must concatenate them in stage 2 anyway.

Sergey_Shuklin
Specialist

Hi, Anushree!

Is there any way to divide this query into several smaller ones? I mean, can you use a filter or something to split your huge data set into parts? At 206 columns by 16 million rows, over 3 billion cells is a real task even for QV!

anushree1
Specialist II
Author

Hi Sergey,

It's in the millions of records, not billions. I did read that QV easily loads many millions of records.

avinashelite

Yes, it will. How much RAM does your system have? Are you loading any other tables along with this one? Any joins?

anushree1
Specialist II
Author

Hi Avinash,

I did run it through the server; it took around 30 minutes, and the generated QVD is 12 GB.

But to make changes to the application, such as creating new objects, I will have to work on my local machine, which has 8 GB of RAM, so I doubt the application would work if I load the 12 GB QVD into it.

Do you have any suggestions or workarounds for this?

avinashelite

OK, so the issue is with your local RAM.

Try this:

Load only the first 1 K or 10 K records into your app and add all the objects and UI elements, then push the app to the server and reload the complete data. This way you can open the app on your local machine, and once the data load completes on the server, you will have the complete data too.
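In the load script, a limited development reload can be done with the FIRST prefix; a sketch, where vMaxRows and BigData.qvd are assumed names:

```qlikview
// On the desktop, keep vMaxRows small; on the server, raise it above the
// real row count (or remove the FIRST prefix) before the full reload.
SET vMaxRows = 10000;

Facts:
FIRST $(vMaxRows)
LOAD * FROM [BigData.qvd] (qvd);
```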

Hope this helps you to resolve the issue

Peter_Cammaert
Partner - Champion III

Development on your server would be a workaround (every Windows Server includes two RDP connections) when resources are tight everywhere else. However, you then run the risk of consuming all server resources (RAM and CPU) when you make a mistake in your load script. Your end users and AccessPoint visitors may not like that.

An alternative is to upgrade your development machine to 16 GB of RAM. Most laptops/desktops can be upgraded to that amount (more RAM may pose a problem in older laptops).

A third approach is to eliminate all columns from your huge data files that aren't used anyway, or to aggregate everything that will only be used in aggregated form.
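Column pruning and pre-aggregation might look like this in the load script; all field names here are illustrative, not from the actual model:

```qlikview
// Load only the fields the UI actually uses, and pre-aggregate a measure
// that is only ever displayed as a sum. This shrinks both row and column
// counts before the QVD ever reaches the 8 GB machine.
Facts:
LOAD CustomerID,
     OrderDate,
     Sum(Amount) AS TotalAmount
FROM [BigData.qvd] (qvd)
GROUP BY CustomerID, OrderDate;
```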

anushree1
Specialist II
Author

Hi Peter,

Could you please state what the RAM size should ideally be for smooth working of a dashboard with millions of records?