Hi,
The beauty of Java is that whatever ENCODING_TYPE you choose when reading a dataflow, the data is held internally as Unicode strings, so it can be converted to the expected encoding on output.
For example, in Talend you can choose to read your INPUT_FILE with the encoding UTF-16 (which can represent every character that UTF-8 or US-ASCII can) and still select UTF-8 on the output component tOracleOutputBulkExec.
In the tOracleOutputBulkExec component, choosing UTF-8 as the encoding handles the conversion of your source data to that encoding for the BULK_FILE before it is loaded with SQL*Loader.
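To make the decode/re-encode idea concrete, here is a minimal plain-Java sketch (not Talend-generated code, just an illustration): bytes read with one charset become an ordinary Unicode String, and writing them out with another charset re-encodes that same String.

```java
import java.nio.charset.StandardCharsets;

public class EncodingRoundTrip {
    public static void main(String[] args) {
        // Sample text with non-ASCII characters.
        String original = "héllo wörld";

        // Pretend this is the INPUT_FILE content: the text encoded as UTF-16 bytes.
        byte[] utf16Bytes = original.getBytes(StandardCharsets.UTF_16);

        // Reading the file with encoding UTF-16 decodes the bytes into a Unicode String.
        String decoded = new String(utf16Bytes, StandardCharsets.UTF_16);

        // Writing with encoding UTF-8 simply re-encodes that same String.
        byte[] utf8Bytes = decoded.getBytes(StandardCharsets.UTF_8);

        // The round trip is lossless: the UTF-8 bytes decode back to the original text.
        System.out.println(original.equals(new String(utf8Bytes, StandardCharsets.UTF_8))); // true
    }
}
```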
Your job will look something like this if you keep the same data structure, for example 5 COLUMN_FIELDS:
tFtpGet --onComponentOk--> tFileList --iterate--> tFileInputDelimited(UTF-16) --row--> tMap(for any conversion) --output--> tOracleOutputBulkExec(UTF-8).
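For illustration, here is a rough plain-Java sketch of what that flow does end to end (hypothetical file names and a ';' delimiter, a simple trim standing in for whatever conversion you would do in tMap): read the delimited UTF-16 file, transform the 5 fields, and write the UTF-8 bulk file for SQL*Loader.

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Arrays;
import java.util.stream.Collectors;

public class DelimitedPipelineSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical paths; substitute your own INPUT_FILE and BULK_FILE locations.
        Path source = Paths.get("input_utf16.csv");
        Path bulk = Paths.get("bulk_utf8.csv");

        // Read the source as UTF-16 (tFileInputDelimited) and write the bulk file as UTF-8
        // (tOracleOutputBulkExec); Java handles the re-encoding between the two.
        try (BufferedReader in = Files.newBufferedReader(source, StandardCharsets.UTF_16);
             BufferedWriter out = Files.newBufferedWriter(bulk, StandardCharsets.UTF_8)) {
            String line;
            while ((line = in.readLine()) != null) {
                // Split the row into its 5 COLUMN_FIELDS.
                String[] fields = line.split(";", 5);

                // Example per-column conversion (the tMap step): trim each value.
                String converted = Arrays.stream(fields)
                        .map(String::trim)
                        .collect(Collectors.joining(";"));

                out.write(converted);
                out.newLine();
            }
        }
    }
}
```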
I don't have time to open my Studio to grab a proper screenshot. Let me know if my ASCII chart isn't clear and I'll post one.