Dear Experts,
I am looking for an alternative solution for the use case below.
Let's say I have 1 million records in my Oracle source table with two columns: (car_id, date_qa [nullable]).
I need to extract the 1M records and load them into a Postgres table (car_id, date_qa [not nullable]) with no major transformation, only data type conversion.
Here is what I tested:
1) Extracting only records where date_qa is not null: 999,999 records total, and the job took just 2 minutes to load them into Postgres.
2) Extracting all records: the job failed because one source record (the 3rd one) has a null date_qa, and only records 1 and 2 got inserted.
At this point, the business needs that one record for a feedback loop with the source system owner, so it can be corrected and resent in the next day's run.
3) Changed commit=1: all records were inserted except that one record, which I captured via the reject flow for the feedback loop.
BUT the issue is that after disabling batch_size and setting commit=1, the job takes 45 minutes to complete.
Please suggest: is there another way to make this better and faster?
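One pattern worth testing outside the commit=1 route (a sketch only, not Talend-specific; sqlite3 stands in for Postgres, and all names here are made up): keep the fast batched commits, and only when a batch fails fall back to row-by-row inserts for that one batch, sending the offending rows to the reject flow. The 999,999 good rows then still load at batch speed.

```python
import sqlite3

# Stand-in for the Postgres target: date_qa has a NOT NULL constraint.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE car (car_id INTEGER, date_qa TEXT NOT NULL)")

# Simulated source extract: one bad record (null date_qa) among 1000.
records = [(i, f"2024-01-{i % 28 + 1:02d}") for i in range(1, 1001)]
records[2] = (3, None)  # the problematic 3rd record from the question

rejects = []
BATCH = 100

def insert_batch(rows):
    """Try a fast batched insert first; only if the batch fails,
    retry it row by row so just the offending rows are rejected."""
    try:
        with conn:  # commits on success, rolls back the batch on error
            conn.executemany("INSERT INTO car VALUES (?, ?)", rows)
    except sqlite3.IntegrityError:
        for row in rows:
            try:
                with conn:
                    conn.execute("INSERT INTO car VALUES (?, ?)", row)
            except sqlite3.IntegrityError:
                rejects.append(row)  # feedback loop to source system owner

for start in range(0, len(records), BATCH):
    insert_batch(records[start:start + BATCH])

inserted = conn.execute("SELECT COUNT(*) FROM car").fetchone()[0]
print(inserted, rejects)  # 999 [(3, None)]
```

With one bad record per million, only one batch out of ten thousand pays the slow row-by-row cost, so the runtime stays close to the 2-minute batched case rather than the 45-minute commit=1 case.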
Maybe put a "where date_qa is not null" clause in your input SQL query? And then a second input query with "where date_qa is null"? (Note: it has to be "is null" — "date_qa = null" matches nothing in SQL.)
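A minimal sketch of that two-query split (sqlite3 standing in for the Oracle source; the table name car_src is made up): the first query feeds the fast batched load into Postgres, the second feeds the reject/feedback file.

```python
import sqlite3

# In-memory stand-in for the Oracle source table.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE car_src (car_id INTEGER, date_qa TEXT)")
src.executemany(
    "INSERT INTO car_src VALUES (?, ?)",
    [(1, "2024-01-01"), (2, "2024-01-02"), (3, None), (4, "2024-01-04")],
)

# Query 1: clean rows, safe to bulk-load with a large batch size.
good = src.execute(
    "SELECT car_id, date_qa FROM car_src WHERE date_qa IS NOT NULL"
).fetchall()

# Query 2: rejects for the feedback loop. Note IS NULL, not "= null" --
# an equality comparison with null evaluates to UNKNOWN and matches nothing.
bad = src.execute(
    "SELECT car_id, date_qa FROM car_src WHERE date_qa IS NULL"
).fetchall()

print(len(good), bad)  # 3 [(3, None)]
```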
Or use tMap to route null and non-null records? https://www.talend.com/resources/adding-condition-based-filters-using-tmap-component/