We are running a QVW in the QMC to extract all the fact tables from the Netezza DB.
Watching the path where the QVDs are created, all of the QVDs were created within 5 hours, but the QVW still had not reached the exit script in the QMC even 7 hours later.
Assuming that our required QVDs had already been created, we killed that reload job.
The problem now is that when we use those QVDs in the final UI application, it takes far longer than usual (more than 20 hours).
So we suspect something is wrong with the QVDs themselves, although their size looks normal compared with the previous month's QVDs.
Is there any chance the already-generated QVDs were corrupted when the reload was killed forcefully?
Please share your thoughts on the above. Thanks.
Hi,
Yes, if you kill it while it is storing to a QVD, it may corrupt that QVD. But if the QVD was written completely, I don't see any impact on it.
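As a quick sanity check, each suspect QVD can be loaded in a small test QVW. This is only a sketch; the path and table names below are placeholders, not anything from the original job. A corrupted QVD will normally fail outright at the LOAD, and comparing QvdNoOfRecords() against the loaded row count catches a truncated file:

```
// Hypothetical sanity check for one suspect QVD; path is a placeholder.
LET vQvdPath = 'D:\QVD\Fact_Sales.qvd';

// QvdNoOfRecords() reads the row count from the QVD's XML header.
LET vHeaderRows = QvdNoOfRecords('$(vQvdPath)');

// Optimized load of the full file; a corrupt QVD usually fails here.
Test:
LOAD * FROM [$(vQvdPath)] (qvd);

// Compare the header count with what was actually loaded.
LET vLoadedRows = NoOfRows('Test');
TRACE Header says $(vHeaderRows) rows, loaded $(vLoadedRows) rows;
```

If the two counts match and the load completes, the file itself is at least structurally intact.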
Regards,
Kaushik Solanki
Hi Kaushik, thanks for the quick response.
Yes, it was updated, and I compared the sizes of the QVDs after they were generated; all the QVDs are updated.
What I mean is: even after the QVDs are fully updated, if the QVW is killed, is there any impact on its related QVDs?
No,
I don't think it will have any impact, but it is important for you to find out why the QVW is not closing.
Regards,
Kaushik Solanki
Yes, that is the biggest thing I have been trying to find out for the last 2 days, but no clues so far. As you said, something is wrong. I was using a for-loop (to read the old QVDs and convert them into new ones) before the exit script, but I don't think it would cause any problem, as it was well tested in dev and worked fine.
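For reference, a loop like the one described might look roughly like this. This is a hedged sketch; the path, file mask, and transformation are assumptions, not the actual script. A common way such a loop hangs is a file mask that matches far more files than expected, so a TRACE inside the loop helps show in the document log exactly where the QVW gets stuck:

```
// Hypothetical sketch of a loop that reads old QVDs and stores new ones.
// The path and naming convention are assumptions.
FOR EACH vFile IN FileList('D:\QVD\Old\*.qvd')

    TRACE Processing $(vFile);    // visible in the reload log, shows loop progress

    Tmp:
    LOAD *                        // plus whatever conversion is needed
    FROM [$(vFile)] (qvd);

    // Store under a new name next to the original, then drop before the next pass.
    STORE Tmp INTO [$(vFile).new.qvd] (qvd);
    DROP TABLE Tmp;

NEXT vFile
```

If the log shows the loop finishing but the task still not exiting, the problem lies after the loop rather than inside it.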
Looking at your various questions on this topic, I think it would be useful to split your task into several tasks, and further to slice your rather big QVDs into smaller ones, maybe at year or year-month level. Of course this adds some overhead, but it will be much easier to handle these smaller tasks/QVDs, and in particular any troubleshooting will be much faster.
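A sliced store along those lines might look like this (only a sketch; the table, field, and path names are placeholder assumptions):

```
// Hypothetical: slice one big fact table into one QVD per year-month.
Fact:
LOAD *, Date(MonthStart(OrderDate), 'YYYYMM') AS YearMonth
FROM [D:\QVD\Fact_Big.qvd] (qvd);

// Collect the distinct periods, then store each slice separately.
Periods:
LOAD Distinct YearMonth AS Period RESIDENT Fact;

FOR i = 0 TO NoOfRows('Periods') - 1
    LET vPeriod = Peek('Period', $(i), 'Periods');

    Slice:
    NoConcatenate                 // keep the slice separate from Fact
    LOAD * RESIDENT Fact WHERE YearMonth = '$(vPeriod)';

    STORE Slice INTO [D:\QVD\Fact_$(vPeriod).qvd] (qvd);
    DROP TABLE Slice;
NEXT i

DROP TABLES Periods, Fact;
```

With this layout, a failed or killed reload only affects the period being written, and the UI application can load just the months it needs.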
Additionally, I suggest considering whether (more) incremental load approaches could be applied. They are useful not only for loading raw data but also within a second or third data layer.
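An incremental pattern in that spirit (again only a sketch; the table name, key field, and timestamp field are assumptions about the Netezza schema) pulls only new or changed rows from the database and merges them with the existing QVD:

```
// Hypothetical incremental load: fetch only new/changed rows, merge with QVD.

// Determine the high-water mark from the existing QVD.
MaxTs:
LOAD Max(LAST_MODIFIED) AS MaxTs
FROM [D:\QVD\Fact_Sales.qvd] (qvd);
LET vLastExec = Peek('MaxTs', 0, 'MaxTs');
DROP TABLE MaxTs;

// Pull only rows changed since the last run from Netezza.
Fact:
SQL SELECT * FROM FACT_SALES
WHERE LAST_MODIFIED > '$(vLastExec)';

// Add back the historic rows that were not re-extracted,
// skipping keys already present in the fresh extract.
CONCATENATE (Fact)
LOAD * FROM [D:\QVD\Fact_Sales.qvd] (qvd)
WHERE NOT Exists(SALES_ID);

STORE Fact INTO [D:\QVD\Fact_Sales.qvd] (qvd);
```

This keeps the nightly extract small, so a hung or killed reload is both less likely and far cheaper to rerun.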
- Marcus