Row-by-row calculation in script is easy to maintain, especially since programmers are used to seeing loops, but it is best used only on small data sets. Imagine running a loop when you get to tens of millions of records.
There are two changes I would recommend. First, you can create a new table from the original with a resident load and move your condition into the WHERE clause (of course you would have to tweak this depending on your business rules). Second, dates are a dual data type, so you don't need to use the Num() function on both sides of the comparison; you save there as well.
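A minimal sketch of the resident-load idea, assuming your field names (OrderId, DeliveryDate, OrigPlannedDate are placeholders, not taken from your actual script):

```
// Hypothetical example: filter with a WHERE clause on a resident load
// instead of looping row by row. Table/field names are placeholders.
NewFacts:
NoConcatenate
Load
    OrderId,
    DeliveryDate,
    OrigPlannedDate
Resident FACTTABLE
Where DeliveryDate <= Today();   // dates are dual, so no Num() wrapping needed

Drop Table FACTTABLE;
```

The engine applies the WHERE clause as it streams the resident table, which is far cheaper than a scripted loop over the same rows.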
One other thing you may want to consider: are you loading from QVDs? Is there an opportunity to use QVDs? Can you move this calculation to your ETL layer instead of doing it in QlikView?
Hard to say without seeing the larger context of what you are trying to accomplish. It looks like you are trying to generate rows between the delivery date and today; IntervalMatch() can be a faster alternative for this.
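For reference, a hedged sketch of the IntervalMatch() pattern, assuming a Calendar table with a Date field and an OrigPlannedDate field in the fact table (both names are guesses based on the thread):

```
// Hypothetical: build one interval per planned date, then let
// IntervalMatch() link every calendar Date falling inside it.
Intervals:
Load Distinct
    OrigPlannedDate as IntervalStart,
    Today()         as IntervalEnd
Resident FACTTABLE;

// Creates a link table of (Date, IntervalStart, IntervalEnd) rows
// for each Date between the planned date and today.
IntervalBridge:
IntervalMatch (Date)
Load IntervalStart, IntervalEnd Resident Intervals;
```

This avoids generating the in-between rows yourself; the engine resolves the matches internally.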
I am not sure myself what the design is about. As Rob said, it appears you are creating a field for each row, but the ApplyMap() could be returning a completely different value from what any of us suspects. If you can explain the objective, that will help a lot.
Of course, an alternative to creating multiple rows of the same data for each day between two specific dates (assuming that is what you are after) is to create a bridge table that holds all the business rules for the multiple manufactured dates (or fields). This way the fact table does not grow too big.
For example, if you want a date record generated between OrigPlannedDate and today:
tmpBridge:
Load Distinct OrigPlannedDate Resident FACTTABLE;

// No common field, so this join produces every (OrigPlannedDate, Date) pair
Inner Join (tmpBridge)
Load Date Resident Calendar;

Bridge:
NoConcatenate
Load OrigPlannedDate, Date
Resident tmpBridge
Where Date >= OrigPlannedDate and Date <= Today();

Drop Table tmpBridge;
This will create a table between your fact table and the calendar table with a row for each day between OrigPlannedDate and today. The advantage is that you do not multiply your fact table rows. Of course, the suggestion above is simplistic, based on what I can see, and you will have to tweak it to suit the rest of your data model.
As for the other conditions, you can either expand the code to build those business rules into your bridge table, or move that calculation into your chart (which should be a last resort if you cannot build it into the bridge).
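To illustrate folding an extra rule into the bridge, here is a hedged sketch; the weekdays-only condition is a made-up example of a business rule, not something from your requirements:

```
// Hypothetical: same bridge load as above, with one extra (invented)
// business rule added to the WHERE clause.
Bridge:
NoConcatenate
Load OrigPlannedDate, Date
Resident tmpBridge
Where Date >= OrigPlannedDate
  and Date <= Today()
  and WeekDay(Date) < 5;   // example rule: weekdays only (Mon-Fri by default)
```

Each additional rule you push into the bridge script is one less calculated condition the chart has to evaluate at render time.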