I want to create a generic Talend job that will load data from Redshift to Oracle tables. I have around 10-15 tables for now.
For example:
Table1(Redshift) -> Generic_Talend_Job -> Oracle Table 1
Table2(Redshift) -> Generic_Talend_Job -> Oracle Table 2
The metadata for each source and target pair is different, i.e. the Oracle and Redshift structures of a given table are the same, but the Table 1 structure and the Table 2 structure are different.
I have seen the Dynamic data type option mentioned in some blogs, but I am not sure how to apply it to my case.
A sample job screenshot and configuration would be really helpful.
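For anyone picturing the "one generic job for many tables" idea above, here is a minimal sketch in plain Java. It assumes the per-table column list is resolved at runtime (in a Talend job this would typically come from a Dynamic column's metadata or a context variable rather than literals); the class and method names are my own for illustration, not part of any Talend API:

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical sketch: derive the Oracle INSERT statement from whatever
// columns the current source table has, so the same logic can serve
// Table 1, Table 2, etc. All names here are illustrative only.
public class GenericInsertBuilder {

    // Build "INSERT INTO <table> (c1, c2, ...) VALUES (?, ?, ...)"
    // for an arbitrary column list.
    public static String buildInsert(String table, List<String> columns) {
        String cols = String.join(", ", columns);
        String marks = columns.stream()
                              .map(c -> "?")
                              .collect(Collectors.joining(", "));
        return "INSERT INTO " + table + " (" + cols + ") VALUES (" + marks + ")";
    }

    public static void main(String[] args) {
        // In a real job the table name and columns would come from context
        // variables / Dynamic metadata, not literals.
        System.out.println(buildInsert("ORACLE_TABLE_1", List.of("ID", "NAME", "AMOUNT")));
        // prints: INSERT INTO ORACLE_TABLE_1 (ID, NAME, AMOUNT) VALUES (?, ?, ?)
    }
}
```

The point is that nothing table-specific is hard-coded, which is what lets a single job handle structurally different tables.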
Hello,
Please take a look at this troubleshooting-development topic: https://community.talend.com/t5/Troubleshooting-Development/How-to-process-changing-data-structure/t...
Let us know if it is OK with you.
Best regards
Sabrina
Hi Sabrina
I created a question for this: https://community.talend.com/t5/Design-and-Development/How-to-use-Dynamic-schema-to-bulk-load-into-r...
I reviewed your link https://community.talend.com/t5/Troubleshooting-Development/How-to-process-changing-data-structure/t... and also managed to create a job shipping data from CSV (tFileInputDelimited) to Oracle (tOracleOutput). Do you know why it doesn't support bulk load to Redshift (tRedshiftOutputBulkExec)?
I tested it in both 6.3.1 and 7.0.1.
Thanks,
Bin
Found this link: https://community.talend.com/t5/Design-and-Development/Which-components-provide-the-Dynamic-Schema-f...
I checked supportDynamic.txt in both 6.3.1 and 7.0.1. It only includes tRedshiftInput/tRedshiftOutput, so bulk upload to Redshift with a Dynamic schema is not supported. Can anyone suggest how to do it DIY?
Thanks,
Bin
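One possible DIY route for the missing Dynamic bulk load, sketched under assumptions (the component choice and all table names, paths, and role ARNs below are mine, not confirmed for this case): write the Dynamic rows out with tFileOutputDelimited, put the file on S3, then issue a Redshift COPY through a row component such as tRedshiftRow. The statement-building part could look like:

```java
// Hypothetical DIY sketch: since the bulk component lacks Dynamic support,
// stage the data as a delimited file on S3 and run COPY ourselves.
// Table name, S3 URI, and IAM role are placeholder assumptions.
public class RedshiftCopyBuilder {

    // Build a Redshift COPY statement for a staged delimited file.
    public static String buildCopy(String table, String s3Uri, String iamRole) {
        return "COPY " + table
             + " FROM '" + s3Uri + "'"
             + " IAM_ROLE '" + iamRole + "'"
             + " DELIMITER ';' IGNOREHEADER 1";
    }

    public static void main(String[] args) {
        System.out.println(buildCopy("public.table1",
                "s3://my-bucket/table1.csv",
                "arn:aws:iam::123456789012:role/redshift-copy"));
    }
}
```

Because COPY takes the column layout from the staged file and target table, the Dynamic schema only needs to survive as far as the delimited file, which tFileOutputDelimited can handle.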