Discussion Board for collaboration related to QlikView App Development.
I would like to know the best approach and the minimum hardware configuration needed to deploy a QVW file of 500 GB to 1 TB.
I came across many threads on this but was not able to get a clear picture.
Below is a thread that I found really helpful:
Re: Large amount of data (120 million rows) ... anyone got experience?
My next question is from the developer's point of view: how do you work with a file that has 100 or 200 million records loaded in Qlik Desktop?
Many times I have tried to open a single table with a few columns and 200 million rows, and a message pops up: "Out of Memory".
My local machine is 64-bit with 4 GB of RAM.
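One common way to keep development workable on a small machine is to load only a sample of the data in the desktop client, for example with QlikView's `First` load prefix driven by a variable. This is a sketch, not your actual script; the table name, field names, and QVD path are placeholders:

```qlikview
// Development-time row limit: reload only a sample on the 4 GB machine.
// For the production reload on the server, set vRowLimit high enough
// to cover the full source, or remove the First prefix entirely.
SET vRowLimit = 1000000;

Facts:
First $(vRowLimit)
LOAD OrderID,
     CustomerID,
     OrderDate,
     Amount
FROM [..\Data\Orders.qvd] (qvd);   // placeholder path
```

QlikView Desktop also has a "Limited Load" option in the script debugger that caps rows per LOAD statement without changing the script, which serves the same purpose for ad-hoc testing.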
The problem is related to the objects, not only to the processor and RAM (which I think are enough in your case).
Displaying 200M rows in a table object is not useful; limit the visualization and preset filters. Who looks at 200M rows?
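One way to follow this advice in the script itself is to pre-aggregate the fact data so front-end objects never scan the raw 200M rows. A hedged sketch, assuming a monthly sales summary is an acceptable grain (table, field, and file names are hypothetical):

```qlikview
// Pre-aggregate to one row per customer per month using a preceding load,
// so chart objects work on the small summary table instead of raw facts.
MonthlySales:
LOAD CustomerID,
     OrderMonth,
     Sum(Amount) AS MonthlyAmount
GROUP BY CustomerID, OrderMonth;
LOAD CustomerID,
     MonthName(OrderDate) AS OrderMonth,
     Amount
FROM [..\Data\Orders.qvd] (qvd);   // placeholder path
```

Whether this fits depends on the analysis: it trades row-level detail for a table that is orders of magnitude smaller in RAM.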
With such huge data, I think opening the document directly on the server is preferable, because local systems are usually not capable of opening such huge dashboards.
As soon as the reload of 200 million rows completes, the document gives an "Out of Memory" error, and I am not even able to save it.
That means that to verify anything, I need to deploy the files each time and cross-check.
That seems crazy to me.
There is no need to publish each time: we can mount the server folders, open the files directly from there, and only copy the QVW files over when it is time to publish.