Hi! I'm trying to import a log file (say, a telephone call log) and split it into two different tables: one for the people making the calls, and the other to log each call they make.
This is my file input format:
PhoneNum, CallTime, DurationTime
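For example, a few lines could look like this (made-up numbers, just to show the idea):

555-0123, 2011-05-02 09:14:03, 00:02:41
555-0123, 2011-05-02 11:30:12, 00:00:55
555-0199, 2011-05-02 11:45:48, 00:07:10

So the same PhoneNum can appear many times in the file, which is why I need to deduplicate the callers.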
And this is, more or less, my DB structure right now:
+------------------+     +---------------------+
| Persons          |     | Calls               |
+------------------+     +---------------------+
| PersonID (PK)    |     | CallID (PK)         |
| PhoneNum         |     | PersonID (FK)       |
| (...)            |     | DurationTime        |
+------------------+     | (...)               |
                         +---------------------+
I managed to open the file and split it using the tUniqRow component, which gives me two flows: one for the uniques (sent to a tMap that feeds the data into the two tables 'Persons' and 'Calls') and one for the duplicates.
When I get the duplicates, I feed them into another tMap that tries to look up the record already inserted into the 'Persons' table (inserted from the first flow). But since the lookup flow is only read at the start of the whole job and cached in a hashtable, the lookup for the person returns null, and I can't get the 'Persons' PK from that table.
Is there any way to make the tMap component access live data instead of the cached data?
Thanks in advance!
Hi
Output the duplicates to a temporary file and create another subjob that reads the duplicates from this temporary file and does the lookup:
tFileInputExcel_1 ---On Subjob Ok--> subjob
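If it helps, here is the same two-pass idea sketched in plain Java/JDBC, outside of Talend. This is only a rough illustration: the file names, the connection details and the way the extra '(...)' columns are skipped are assumptions, not your real job.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.HashSet;
import java.util.Set;

public class CallLogLoader {

    public static void main(String[] args) throws Exception {
        // Connection details are placeholders; adjust them to your own database.
        try (Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/callsdb", "user", "password")) {

            // First subjob: split uniques/duplicates (the tUniqRow part),
            // insert the uniques into Persons and Calls,
            // and push the duplicates to a temporary file.
            Set<String> seenPhones = new HashSet<>();
            try (BufferedReader in = new BufferedReader(new FileReader("calls.log"));
                 PrintWriter dup = new PrintWriter(new FileWriter("duplicates.tmp"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    // f[0] = PhoneNum, f[1] = CallTime, f[2] = DurationTime
                    // (CallTime and the other '(...)' columns are left out below for brevity)
                    String[] f = line.split("\\s*,\\s*");
                    if (seenPhones.add(f[0])) {
                        long personId = insertPerson(con, f[0]);
                        insertCall(con, personId, f[2]);
                    } else {
                        dup.println(line);
                    }
                }
            }

            // Second subjob (runs only after the first one has finished, like OnSubjobOk):
            // the Persons rows now exist, so the lookup by PhoneNum succeeds.
            try (BufferedReader in = new BufferedReader(new FileReader("duplicates.tmp"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    String[] f = line.split("\\s*,\\s*");
                    long personId = lookupPersonId(con, f[0]);
                    insertCall(con, personId, f[2]);
                }
            }
        }
    }

    // Insert one row into Persons and return the generated PersonID.
    private static long insertPerson(Connection con, String phoneNum) throws Exception {
        try (PreparedStatement ps = con.prepareStatement(
                "INSERT INTO Persons (PhoneNum) VALUES (?)", Statement.RETURN_GENERATED_KEYS)) {
            ps.setString(1, phoneNum);
            ps.executeUpdate();
            try (ResultSet keys = ps.getGeneratedKeys()) {
                keys.next();
                return keys.getLong(1);
            }
        }
    }

    // Insert one row into Calls for the given person.
    private static void insertCall(Connection con, long personId, String duration) throws Exception {
        try (PreparedStatement ps = con.prepareStatement(
                "INSERT INTO Calls (PersonID, DurationTime) VALUES (?, ?)")) {
            ps.setLong(1, personId);
            ps.setString(2, duration);
            ps.executeUpdate();
        }
    }

    // The equivalent of the tMap lookup against the Persons table.
    private static long lookupPersonId(Connection con, String phoneNum) throws Exception {
        try (PreparedStatement ps = con.prepareStatement(
                "SELECT PersonID FROM Persons WHERE PhoneNum = ?")) {
            ps.setString(1, phoneNum);
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                return rs.getLong("PersonID");
            }
        }
    }
}

The point is only the ordering: the duplicates are not looked up until the first pass has finished inserting into Persons, which is exactly what the On Subjob Ok trigger guarantees between the two subjobs.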
Best regards
shong