kalyaniB1
Contributor

Handling special characters/bad data in a Talend ETL integration job

In one of our integration jobs we have enabled the UTF-8 configuration to load non-English characters into a Teradata table, with commits enabled every 16k rows. Bad data in the source (SuccessFactors system) caused a SQL "untranslatable character" error, and the entire batch was not inserted into the table.

We don't know which field or value is causing the issue, and we have around 70 fields in total. Could you please suggest a design approach we can follow to tackle this? Currently we are using the tMap component to map the SuccessFactors source fields to the Teradata target fields.
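One common pattern for this kind of problem (a sketch, not an answer from the thread) is to validate each row before the load and divert rows with unencodable characters to a reject flow, so one bad row no longer fails the whole 16k-row batch. Since Talend routines are plain Java, the check could be a small routine like the hypothetical one below; the class name, method names, and the assumed UTF-8 charset are illustrative, not from the original post.

```java
import java.nio.charset.Charset;
import java.nio.charset.CharsetEncoder;

public class CharsetCheck {

    // Returns true if the value can be encoded cleanly in the given charset.
    // Null values are treated as encodable so they pass through untouched.
    public static boolean isEncodable(String value, String charsetName) {
        if (value == null) {
            return true;
        }
        CharsetEncoder encoder = Charset.forName(charsetName).newEncoder();
        return encoder.canEncode(value);
    }

    // Re-encodes the value through the charset; characters the charset
    // cannot represent are replaced with the charset's default replacement
    // byte (typically '?'), instead of failing the insert.
    public static String sanitize(String value, String charsetName) {
        if (value == null) {
            return null;
        }
        Charset cs = Charset.forName(charsetName);
        return new String(value.getBytes(cs), cs);
    }
}
```

In a tMap expression filter you could then call something like `CharsetCheck.isEncodable(row1.someField, "UTF-8")` and route failing rows to a second, reject output that writes them to an error file or table; that reject output also tells you which of the 70 fields carried the bad value. Checking the Teradata session character set rather than UTF-8 may be more accurate for this error, but that depends on the job's connection configuration.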


0 Replies