What is the limit on QVD size & how can we optimize an oversized QVD?
Hi Jaime,
I have 4,590,445,300 unique entries in my QVD, and it fails every time I try to load it into a QVW. QlikView throws an error message saying 'Execution of script failed, do you want to reload old data?'
It would be great if you could help me out with this.
Thanks a lot in advance.
Regards,
Sagar Gupta
The most likely issue here is that you are trying to write to a folder that doesn't exist, a folder you don't have access to, or a file that is locked.
Double check the folders and permissions. If you are overwriting an existing QVD, try renaming it before the load; if the file is locked, that will prevent you from doing so. You may need to reboot your machine to free the lock - but first check whether you have any other QlikView instances open that may be holding it.
Hope that helps,
Steve
Hi Steve,
Thanks a lot for your reply
I have access to all the folders and files and none of them are locked. Below I will try to explain what I am trying to do:
I have 8 txt files of 2GB each, and the task is to create 800GB of data out of them. Here is the approach I followed:
Step 1: Create a QVD out of the existing 8 files by concatenating them - Successful
Step 2: Concatenate/append the resulting QVD 50 times to create a final QVD - Successful (the resulting QVD is 180GB with 4,590,445,300 records)
Step 3: Load the final QVD into a QVW such that each record is unique. To do this I have created a new column in the QlikView script: RowNo() as NewColumn. When I execute this step it runs for around 18 hours and for some reason it never takes in all 4,590,445,300 records (a rough sketch of the script is below).
The moment I hit OK it throws the error message 'Execution of script failed, do you want to reload old data?'
Note: I have tried loading the same final QVD limited to 5,000 records to confirm that there is no issue in the final QVD itself.
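For reference, here is a rough sketch of the script I am running. The paths, file names and the text format spec are simplified placeholders; the real files have many more columns:

// Step 1: concatenate the 8 source text files into one QVD
Base:
LOAD *
FROM [..\Data\Source*.txt]
(txt, utf8, embedded labels, delimiter is '\t');

STORE Base INTO [..\QVD\Base.qvd] (qvd);
DROP TABLE Base;

// Step 2: append the base QVD to itself 50 times to build the final QVD
Final:
LOAD * FROM [..\QVD\Base.qvd] (qvd);

FOR i = 2 TO 50
    Concatenate (Final)
    LOAD * FROM [..\QVD\Base.qvd] (qvd);
NEXT i

STORE Final INTO [..\QVD\Final.qvd] (qvd);
DROP TABLE Final;

// Step 3: load the final QVD and add a unique row identifier
// (this is the step that fails after ~18 hours)
Data:
LOAD *,
     RowNo() AS NewColumn
FROM [..\QVD\Final.qvd] (qvd);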
Any sort of help will be highly appreciated
Many thanks in advance, Steve Dark.
Regards,
Sagar Gupta
With those kinds of row counts you need to ensure that everything is as optimised as it possibly can be. Ensure you are not loading any fields you don't actually need - redundant join keys, for example. Round all your numbers where possible so that you have fewer unique values.
Is there no way that you can pre-aggregate the values so that you have fewer rows?
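As a rough sketch only (the field names below, such as Amount, TransactionTimestamp and CustomerID, are placeholders since I don't know your data model), that could look something like this:

// Keep only the fields you need, and reduce the number of distinct values
Trimmed:
LOAD
    TransactionID,
    CustomerID,
    Round(Amount, 0.01)               AS Amount,           // round to 2 decimal places
    Date(Floor(TransactionTimestamp)) AS TransactionDate   // drop the time portion
FROM [..\QVD\Final.qvd] (qvd);

// If any level of aggregation is acceptable, pre-aggregate to cut the row count
Aggregated:
LOAD
    TransactionDate,
    CustomerID,
    Sum(Amount) AS Amount
RESIDENT Trimmed
GROUP BY TransactionDate, CustomerID;

DROP TABLE Trimmed;   // drop the detail table if only the aggregate is needed

Rounding is particularly effective because QlikView stores each distinct value only once, so fewer unique values means smaller symbol tables in RAM.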
What are the memory and CPU on the box doing during the reload? Are you simply running out of memory?
By default you will not be able to perform this reload via QlikView Server, as I seem to recall there is a four hour limit on reloads.
Hope that helps,
Steve
Hi Steve,
Thanks a lot for your quick response
Here are the machine specs:
Cores: 24
RAM: 256GB
RAM utilization goes up to 196GB during the reload, but the same issue occurs - it doesn't load all the records.
It would be a great help if you could provide your inputs on optimizing the QVD.
Note: This data is at transaction level and we can't roll it up or group by any field. We are also trying Direct Discovery with Redshift, but we are facing a lot of performance issues there as well.
Many thanks in advance, Steve Dark.
Regards,
Sagar Gupta
1. 800GB data creation - Successful
2. Load data in QlikView - Failed
3. Load data in QlikView (second attempt) - Failed
4. Error message
As far as we know, a QVD does not have any physical size limitation. However, to create a QVD the data must be in memory, so RAM is the limiting factor. For more, visit http://intellipaat.com/blog/qlikview-training-business-intelligence-software/