Hi,
Can anyone help me with this? Much appreciated.
I have QlikView Desktop installed on a virtual server running Windows Server 2012 R2 with 10 vCPUs at 2.7 GHz and 64 GB RAM.
The QVW file is about 1.3 GB with 800 million records, only 4 tabs and a few tables and charts.
I assume the server should be capable enough; nothing else is running on it, and only one user is accessing it.
But it takes 4-5 minutes just to open the dashboard, and saving takes a long time as well. CPU usage is around 90-95% when opening the file, and memory usage reaches 16 GB.
Any ideas guys?
Hi xi ci,
Did you store your data in QVD format?
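In case it helps, storing to and reloading from QVD looks roughly like this (a minimal sketch; the table name "Facts" and the file path are placeholders for your own):

```qlikview
// One-time: write the fact table out to a QVD file.
STORE Facts INTO [D:\QVD\Facts.qvd] (qvd);

// Later reloads can pull from the QVD instead of the source system.
// Loading all fields unchanged, with no WHERE clause or transformations,
// keeps the load "optimized" and therefore much faster.
Facts:
LOAD * FROM [D:\QVD\Facts.qvd] (qvd);
```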
Hi,
If you are not short on disk space, try changing the compression setting to None:
Document Properties -> General -> Compression.
By default it is set to High, which is why you see 90-95% CPU while opening the file: the QlikView engine has to decompress the QVW.
The same applies when saving: the engine has to compress the data. So if disk space is not a concern, turn off compression.
Hope this helps.
Regards,
Andrei
Thanks Andrei. If I turn off compression, what sort of file size are we talking about? The original file is about 1.3 GB.
Also, do you think QlikView can handle 100 million rows in a table/chart?
Not more than what you see in RAM usage. It really varies with the data itself (field types, number of distinct values, etc.).
QlikView can handle fact tables of 100M+ rows, but you should work carefully with that amount of data, because every trifle is worth gold...
Do you want to show all 100M rows in a single table, or are you interested in aggregations over the 100M+ rows? You can use a drill-down technique: show overall aggregated data first, then drill down to the details in another tab/sheet/app if needed...
Regards,
Andrei
Yes, 100M rows in a single table/chart, just a simple straight table.
Yes, in a QVD file.
Hmm... you can try; it really depends on the data itself. In my experience, I have never met anyone who could handle 100M+ records without narrowing down the data set...
/Andrei
>>yes, 100m rows in single table / chart, just simple straight table
Well, that will take a really long time to render and require a massive amount of memory...
I always limit really large tables to no more than 100k rows, and often even fewer, using a calculation condition. As AK said, not many people can draw any useful meaning from 100M rows of data.
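For reference, a calculation condition goes under Chart Properties -> General -> Calculation Condition. A minimal sketch (the field name OrderID is a placeholder) might be:

```qlikview
// The chart is only calculated and rendered when this evaluates to true.
Count(DISTINCT OrderID) <= 100000
```

Until the user's selections bring the possible row count under the limit, QlikView shows a "calculation condition unfulfilled" message instead of trying to draw the full table.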