
Unable to write multibyte characters (like Chinese/Japanese) to Oracle
I want to read multibyte characters (e.g. Chinese/Japanese) from Excel and insert them into an Oracle database table. Here is what I did:
- I created the database table with an NVARCHAR2 column for storing the multibyte characters.
- I created the Excel input in Talend and can see the data in the preview (and in the tLogRow output in the console).
- I am using a tMap to join the two (the Excel input and the database table).
The job runs successfully and inserts records into the table, but the inserted data is junk characters, not the actual ones; even when I export the data from SQL Developer/PL/SQL Developer to Excel, it is junk.
I am also using UTF-8 as the encoding in Talend Data Integration 6.2.1 as well as in the Oracle database.
Is there anything I am missing?
Thanks in advance.
Regards,
Rakesh D
Accepted Solutions

Have you tried adding "-Dfile.encoding=UTF-8" to the JVM parameters to see if it works?
Best regards
Sabrina
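Note that the flag needs the leading `-D` (in Talend it would typically go under the Run view's Advanced settings as a JVM argument, or into the Studio .ini file — exact location depends on your setup). A minimal sketch to verify the setting actually took effect (the class name is hypothetical):

```java
// Minimal check: print the JVM's default encoding so you can verify
// that -Dfile.encoding=UTF-8 was picked up. Run it as:
//   java -Dfile.encoding=UTF-8 EncodingCheck
import java.nio.charset.Charset;

public class EncodingCheck {
    public static void main(String[] args) {
        // Both lines should report UTF-8 when the flag is applied correctly
        System.out.println(System.getProperty("file.encoding"));
        System.out.println(Charset.defaultCharset());
    }
}
```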

It does not help; I am getting this error:
Error: Could not find or load main class Dfile.encoding=utf-8


Try inserting Chinese characters into your DB directly with a SQL query.

Hello,
I tried to insert directly in the DB, but I get the same issue:

```sql
create table tmp_multi (str varchar2(100));
insert into tmp_multi values ('碟庫');
select * from tmp_multi;
--> ¿¿
```

Your thinking is correct. Below are my nls_database_parameters settings; please advise which parameter I need to modify to insert multibyte characters (maybe NLS_CHARACTERSET):
| Parameter | Value |
| --- | --- |
| NLS_LANGUAGE | AMERICAN |
| NLS_TERRITORY | AMERICA |
| NLS_CURRENCY | $ |
| NLS_ISO_CURRENCY | AMERICA |
| NLS_NUMERIC_CHARACTERS | ., |
| NLS_CHARACTERSET | WE8MSWIN1252 |
| NLS_CALENDAR | GREGORIAN |
| NLS_DATE_FORMAT | DD-MON-RR |
| NLS_DATE_LANGUAGE | AMERICAN |
| NLS_SORT | BINARY |
| NLS_TIME_FORMAT | HH.MI.SSXFF AM |
| NLS_TIMESTAMP_FORMAT | DD-MON-RR HH.MI.SSXFF AM |
| NLS_TIME_TZ_FORMAT | HH.MI.SSXFF AM TZR |
| NLS_TIMESTAMP_TZ_FORMAT | DD-MON-RR HH.MI.SSXFF AM TZR |
| NLS_DUAL_CURRENCY | $ |
| NLS_COMP | BINARY |
| NLS_LENGTH_SEMANTICS | BYTE |
| NLS_NCHAR_CONV_EXCP | FALSE |
| NLS_NCHAR_CHARACTERSET | AL16UTF16 |
| NLS_RDBMS_VERSION | 11.2.0.4.0 |
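The NLS_CHARACTERSET above, WE8MSWIN1252, is a single-byte Western European character set that cannot represent Chinese at all, which is why a plain VARCHAR2 insert comes back as `¿¿`. The NVARCHAR2 side (AL16UTF16) can hold the data, but the values must actually reach the server as national-character strings (e.g. `PreparedStatement.setNString` from JDBC, or an N'…' literal — bearing in mind a literal typed through a non-Unicode client can still be mangled before it reaches the server). A small sketch of the lossy conversion, using windows-1252 as the Java equivalent of WE8MSWIN1252:

```java
// Why WE8MSWIN1252 mangles Chinese text: round-tripping a string through
// the windows-1252 charset replaces unmappable characters with '?',
// while a Unicode charset like UTF-8 preserves them.
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class CharsetRoundTrip {
    static String roundTrip(String s, Charset cs) {
        // Encode then decode with the same charset; characters the
        // charset cannot represent are lost in the encoding step
        return new String(s.getBytes(cs), cs);
    }

    public static void main(String[] args) {
        String s = "碟庫";
        System.out.println(roundTrip(s, Charset.forName("windows-1252"))); // lost
        System.out.println(roundTrip(s, StandardCharsets.UTF_8));          // preserved
    }
}
```

So the usual options are to bind into the NVARCHAR2 column as national-character data, or to migrate the database character set to a Unicode one such as AL32UTF8; which is appropriate depends on your environment.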

Hi all,
I know this thread has been answered, but I am having issues exporting a table from a Teradata source that contains Chinese characters. The job I have made is as below:
However, the output I am getting is shown here, where the substitute characters appear where the Chinese characters are supposed to be.
Can someone please help me with this?
Thanks


Hi Denis,
Thank you for the reply!
Sadly, it didn't work for me.
It seems I don't have the UTF-16 option in the drop-down list and had to enter it as a custom encoding.
