Brownie
Contributor III

Calculate End date from the start date of the next row using peek

where the fields are EID (Employee ID), Dep ID, Dep Name, Start Date, and End Date.

The data should look like this:

Start Date    End Date
1/1/1995      11/30/2005
12/1/2005     12/31/2014
1/1/2015      today's date, as there is no later start date for employee 1 (i.e. this is the last department of employee 1)

and so on.

[Screenshots attached: Screenshot_3.png, Screenshot_4.png]

8 Replies
RsQK
Creator II

Hi, this should do it:

temp_data:
LOAD
EID,
DID,
DN,
DATE(DATE#(SD,'M/D/YYYY')) AS SD
INLINE [
EID|DID|DN|SD
1|1|IT|1/1/1995
1|2|Sales|12/1/2005
1|5|Finance|1/1/2015
2|1|IT|1/1/1995
2|3|Marketing|12/1/2008
3|2|Sales|1/1/1995
4|3|Marketing|1/1/2010
] (DELIMITER IS '|');

temp_data2:
NOCONCATENATE
LOAD * RESIDENT temp_data
ORDER BY EID, SD DESC;

DROP TABLE temp_data;

data:
LOAD *,
    // Rows are ordered by EID, SD DESC, so PEEK('SD') reads the start date
    // of this employee's next department (the row loaded just before).
    IF(PREVIOUS(EID)=EID, DATE(PEEK('SD')-1), DATE(TODAY())) AS EndDate
RESIDENT temp_data2;

DROP TABLE temp_data2;
robert99
Specialist III

Hi @Brownie 

I don't like Peek. If it works, great.

Another way, which I first used many years back and would now always use, is the way I did it in PBI. Something like this:

Do a TEMP load of the data using ORDER BY EID and SD, adding 2 more fields as well:

TEMP1:
LOAD
    *,
    EID & ROWNO() AS RN1Temp,
    EID & ROWNO()-1 AS RN2Temp
FROM -----------
ORDER BY etc;

Then do a resident mapping load:

Map1:
RN2Temp AS RE1Temp,
SD-1 AS FD
RESIDENT TEMP1;

Then set up the table as required, but applying the above map table:

TABLENAME:
LOAD
    *,
    ApplyMap('Map1', RE1Temp, TODAY()) AS FD
RESIDENT TEMP1;

DROP TABLE TEMP1;
DROP FIELDS RE1Temp, RE2Temp;

Hope this is correct.

Brownie
Contributor III
Author

Thank you so much! Is there a way I could arrange the data in ascending order? When I changed DESC to ASC, the dates got messed up.
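(One way to keep the Peek logic and still end up with ascending order is to compute EndDate on the DESC-sorted table first, then re-sort with a final resident load; a sketch assuming RsQK's table name `data` and the hypothetical name `final_data`:)

```
// EndDate has already been computed on the DESC-sorted table "data".
// A final NOCONCATENATE resident load re-sorts the finished table ascending.
final_data:
NOCONCATENATE LOAD * RESIDENT data
ORDER BY EID, SD ASC;

DROP TABLE data;
```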

Brownie
Contributor III
Author

[Screenshots attached: Screenshot_5.png, Screenshot_6.png]

robert99
Specialist III

Hi @Brownie, you need to include MAPPING in front of LOAD:

Map1:

Mapping load etc

Also add all the fields to TABLENAME from TEMP1. 

And drop Temp as well

Drop table Temp;

Another point: it should be RN1Temp or RE1Temp, not both. Same for RN2Temp/RE2Temp. Sorry, my mistake.

Brownie
Contributor III
Author

I didn't use RE1Temp in the ApplyMap as it wasn't picking up the field from the Map1 table. This gave me the previous day:

[Screenshots attached: Screenshot_7.png, Screenshot_8.png]

robert99
Specialist III

Hi @Brownie 

Change to

applymap('Map1', RN1Temp, today())

and hopefully it should work.

Let me know if it doesn't (or if it does) and I will look at this tomorrow, when I have more time, if required.

And in the mapping load:

RN2Temp as RN1Temp (in fact the name change is unnecessary; I thought it would make it a bit clearer).

Thanks
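(For reference, robert99's mapping-based approach with these corrections applied might look like the sketch below. It reuses RsQK's inline sample data; TABLENAME, Map1, and the RN1Temp/RN2Temp keys come from robert99's outline, and the '-' separator in the keys is an addition of mine to keep concatenated EID/row numbers unambiguous.)

```
temp_data:
LOAD
    EID, DID, DN,
    DATE(DATE#(SD,'M/D/YYYY')) AS SD
INLINE [
EID|DID|DN|SD
1|1|IT|1/1/1995
1|2|Sales|12/1/2005
1|5|Finance|1/1/2015
2|1|IT|1/1/1995
2|3|Marketing|12/1/2008
3|2|Sales|1/1/1995
4|3|Marketing|1/1/2010
] (DELIMITER IS '|');

// Sort ascending and build row keys: RN1Temp identifies this row,
// RN2Temp the row loaded immediately before it.
TEMP1:
NOCONCATENATE LOAD
    *,
    EID & '-' & ROWNO() AS RN1Temp,
    EID & '-' & (ROWNO()-1) AS RN2Temp
RESIDENT temp_data
ORDER BY EID, SD;

DROP TABLE temp_data;

// Row n+1 carries RN2Temp equal to row n's RN1Temp, so this maps each
// row's key to the day before the NEXT row's start date. The EID prefix
// stops the mapping from crossing employee boundaries.
Map1:
MAPPING LOAD
    RN2Temp,
    DATE(SD - 1)
RESIDENT TEMP1;

// Rows with no following row (the employee's last department)
// miss the map and fall back to today's date.
TABLENAME:
LOAD
    *,
    DATE(APPLYMAP('Map1', RN1Temp, TODAY())) AS EndDate
RESIDENT TEMP1;

DROP TABLE TEMP1;
DROP FIELDS RN1Temp, RN2Temp;
```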

Brownie
Contributor III
Author

Thank you so much!!