Load link_Sup_ADCNO,
     BeginningDateofSupervision + IterNo() - 1 as Dup_analysis_date,
     S as Dup_S
Resident Supervision While IterNo() <= Supervision_End_Date - BeginningDateofSupervision + 1;
The statement above has been running in the application for more than two hours. I posted this issue before and got some good feedback suggesting that I pull the data from the resident table, but I am still having performance issues. Any ideas on how to make this While statement run faster and more efficiently?
Thanks
Rick
I don't have a date that can be used as an interval; the user can enter any date they want via an input box.
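For what it's worth, once every supervised date exists as its own row (which is what the While statement above produces), the date typed into the input box can be matched directly in a chart expression. A rough sketch only, where vUserDate is a hypothetical variable tied to the input box and the date formats would need to line up:

Count({<Dup_analysis_date = {'$(=Date(vUserDate))'}>} DISTINCT link_Sup_ADCNO)

which would count the residents under supervision on whatever date the user enters.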
I will try all these options, or a mix of them, and let you all know how I do.
Wish me luck!
Thanks
Rick
This isn't ideal (one of the concatenates from QVD is unoptimized), but here's a combination of incremental load and the DateKey approach I was talking about. It seems to go quite fast, including the unoptimized concatenate.
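In outline, that combination might look something like this. This is only a sketch of the pattern: the table name DateKeys, the file DateKeys.qvd, and the assumption that Supervision holds just the new or changed rows at this point are my placeholders, not the actual script.

// Expand the new/changed Supervision rows into one row per key per date.
DateKeys:
Load link_Sup_ADCNO,
     BeginningDateofSupervision + IterNo() - 1 as Dup_analysis_date
Resident Supervision
While IterNo() <= Supervision_End_Date - BeginningDateofSupervision + 1;

// Bring back the previously expanded rows for all other keys from the QVD.
// The Where Not Exists() clause is what makes this concatenate unoptimized.
Concatenate (DateKeys)
Load link_Sup_ADCNO, Dup_analysis_date
From DateKeys.qvd (qvd)
Where Not Exists(link_Sup_ADCNO);

Store DateKeys into DateKeys.qvd (qvd);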
I'm getting about 42 DateKey values per row of the original table. To keep from causing trouble, I put that data in its own table so as not to duplicate rows of the original table. If you have 10,000,000 keys, you're going to have about 420,000,000 rows in this table. It may not be as big in memory as it sounds, though. I figure a couple bytes for the DateKey per row, three bytes for the regular Key, so around 2 GB being added to your data model to split all of your keys out into every date associated with those keys. Not sure if you consider that big or not, and also not sure if I've calculated correctly.
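Spelled out, that estimate is roughly: 10,000,000 keys × 42 dates each ≈ 420,000,000 rows, and 420,000,000 rows × (2 bytes for the DateKey + 3 bytes for the Key) ≈ 2.1 GB added to the data model.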
I really appreciate the time and effort on your part, sir. I will try it and let you know how I do.
I am quite impressed with your follow-up and also with everyone's assistance.
Quite impressive!!
Thanks
Rick