danielgargiulo
Partner - Creator

deleting record from previous table during load

Hi all,

I have several hundred files to load, all with exactly the same fields. I am currently using:

NMDS:
Load *
FROM [..\..\8.Import\0.NMDS\2013\*.txt]
(txt, codepage is 1252, no labels, delimiter is ',', msq);

One requirement is to look for 'D1' records in the 'Message' field. If a 'D1' record is found, I need to search the records loaded so far for the matching primary key and delete (or flag) the record that was previously loaded. Any suggestions on how this could be achieved?

Kind regards,

Daniel


6 Replies
Not applicable

Hi Daniel,

Let me know if this works:

Message_inline:
Load * Inline [
Message
D1
];

NMDS:
Load *
FROM [..\..\8.Import\0.NMDS\2013\*.txt]
(txt, codepage is 1252, no labels, delimiter is ',', msq)
Where Exists(Message);

Continue to use the Where Exists() clause on all your files. This will ensure that only records relevant to D1 are loaded and the rest are ignored. Flagging them is a bigger problem; avoiding the records is easier.

Anonymous
Not applicable

Hi Daniel,

I hope this is useful. Assuming that you already have the previously loaded files in memory:

1- Load your current file:

NMDS:
Load *
FROM [..\..\8.Import\0.NMDS\2013\*.txt]
(txt, codepage is 1252, no labels, delimiter is ',', msq);

2- Left join a flag with those IDs that match your requirement:

LOAD *
Resident Previous_Loaded_Records;

Left Join
Load
     ID,
     1 as [Delete Flag]
Resident NMDS
Where Match(Message, 'D1') > 0;

Regards,

rwunderlich
Partner Ambassador/MVP

I think I understand your question, but can you post a bit of sample data?

-Rob

danielgargiulo
Partner - Creator
Author

Hi Rob,

Please see the attached QVW with a basic inline load statement that hopefully illustrates the concept. Note: in reality there are hundreds of tables to load. I don't think a join after the load will work, as there will always be a new record with the same PK added once the old one has been deleted or flagged.

Thanks,

Dan

rwunderlich
Partner Ambassador/MVP

If the delete is always followed by a new insert, could the problem be restated as wanting only the last record for each PK? If so, how about:

DataTable:
LOAD *, AutoNumber(RecNo(), PK) as Seq INLINE [
    PK, Message, Value
    1, A1, 34
    2, A1, 54
    3, A1, 67
    4, A1, 89
];

LOAD *, AutoNumber(RecNo(), PK) as Seq INLINE [
    PK, Message, Value
    5, A1, 87
    6, A1, 23
    2, D1,
    2, A1, 80
];

INNER JOIN (DataTable)
LOAD
    PK,
    Max(Seq) as Seq
RESIDENT DataTable
GROUP BY PK;

Will that work or am I being naive about the data?

-Rob

http://masterssummit.com

http://robwunderlich.com

danielgargiulo
Partner - Creator
Author

Hi Rob,

No, you are not being naive; you have produced a fantastic solution. I think I was overcomplicating this! Your solution works on the sample provided, and I will test it on the full data in the next day or two.

The delete may not always be followed by a new insert; however, if it is not, the delete record would be the one left, and those will be excluded anyway. The only thing I need to ensure is that the records are ordered 'D1' then 'A1' as they are loaded from each file.


Thanks for your help and enjoy the week ahead.


Dan