I have a table with 50 M+ records in Hive. I am trying to read those records into a QVD, but the load keeps failing after about 25 M records.
1. Is there any restriction on the number of records a QVD can hold?
2. I am thinking of generating 5 QVDs with 10 M records each and then fetching records from these 5 QVDs into the dashboard (QVW). Is there any mechanism to dynamically distribute records across a number of QVDs?
3. Will my final dashboard (QVW) be able to hold 50 M+ records?
Please suggest an approach and any performance improvements.
Thanks in advance.
1. >> Is there any restriction on the number of records a QVD can hold?
No, but there may be restrictions caused by your data source.
2. >> Is there any mechanism to dynamically distribute records across a number of QVDs?
No automatic mechanism, but you can write your load/save process so that it partitions the data across QVDs (see the sketch after this list).
3. >> Will my final dashboard (QVW) be able to hold 50 M+ records?
There is no hard limit (it comes down to available RAM), but 50 M rows is a lot for a dashboard.
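For point 2, here is a minimal sketch of one way to partition the extract into fixed-size QVD chunks, assuming a contiguous numeric key; the table name (fact_table), key field (record_id), DSN (HiveDSN) and file names are placeholders, not your actual objects:

// Sketch only: split a large Hive extract into ~10 M row QVDs.
// fact_table, record_id and HiveDSN are placeholders.
ODBC CONNECT TO HiveDSN;

LET vChunkSize = 10000000;   // rows per QVD
LET vChunks    = 5;          // 5 x 10 M covers 50 M+ rows

FOR i = 0 TO $(vChunks) - 1

    LET vFrom = $(i) * $(vChunkSize);
    LET vTo   = $(vFrom) + $(vChunkSize) - 1;

    Chunk:
    SQL SELECT *
    FROM fact_table
    WHERE record_id BETWEEN $(vFrom) AND $(vTo);

    STORE Chunk INTO [Fact_$(i).qvd] (qvd);
    DROP TABLE Chunk;        // free RAM before loading the next chunk

NEXT i

If there is no convenient numeric key, the same loop can partition on a date field or a modulo of another field; the point is that each pass holds only one chunk in RAM before it is stored and dropped.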
Thanks a lot, Jonathan.
How much RAM will be required to load 50 M records into a QVD? Currently I have 16 GB on my system, and the QVD load is failing after about 25 M records.
I am planning to write logic that loads the records into separate QVDs in chunks (pulling them back into the QVW is sketched below).
Regards,
Onkar
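On the fetch side, a wildcard load is one way to pull all the chunk QVDs back into the dashboard; this sketch assumes the Fact_*.qvd names from the earlier example:

// Sketch only: reload every chunk QVD into the dashboard (QVW).
// With no transformations in the LOAD, this remains an optimized QVD load.
Facts:
LOAD *
FROM [Fact_*.qvd] (qvd);

The chunks auto-concatenate into one table because they share the same field structure.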