Anonymous
Not applicable

UTF-8 BOM Encoded File Processing

We receive a daily file encoded as UTF-8 with BOM, and because of this our Talend ETL job always misses the first row of the file.

 

Sample Data in File:

P, 1234, $10

Q,1235,$20

R, 1236, $15

 

Our actual flow is:

tFileList ==> tFileInputDelimited ==> tReplicate ==> tFilterRow ==> tMSSqlSCD

 

 

tFileInputDelimited is actually able to process all rows, but when we use tFilterRow it always misses the first row of each file.

The condition for tFilterRow is: column0 Equals "P"

 

When we configured tLogRow, we found a few special characters prefixed to the first row of every file, for example: ???P
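Those characters are almost certainly the three-byte UTF-8 byte order mark (EF BB BF). Depending on the encoding used to read the file, the BOM survives either as the single character U+FEFF (when decoded as UTF-8) or as three stray characters (when decoded with a single-byte encoding); either way, column0 of the first row is no longer exactly "P", so the tFilterRow condition never matches. A minimal plain-Java illustration of the effect (a sketch only, not tied to any Talend component):

class BomDemo {
    public static void main(String[] args) {
        // What column0 of the first row effectively contains when the BOM is not stripped
        String firstField = "\uFEFF" + "P";
        System.out.println(firstField.equals("P"));                        // false -> row 1 fails the filter
        System.out.println(firstField.replace("\uFEFF", "").equals("P"));  // true once the BOM is removed
    }
}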

Also, when we opened our CSV files in Notepad++, we discovered that the file is encoded as UTF-8-BOM.

We only have an option for UTF-8 in the Advanced settings of tFileInputDelimited.

 

Please let us know how we can process a UTF-8-BOM file in a Talend job.
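One possible workaround, sketched below under assumptions (the file is read with the UTF-8 encoding so the BOM arrives as the single character U+FEFF, the default tJavaRow input_row/output_row variables, and the three-column schema from the sample data; adjust to the real schema), is to strip U+FEFF from the first column in a tJavaRow placed right after tFileInputDelimited, before tFilterRow:

// tJavaRow sketch: remove the UTF-8 BOM (decoded as U+FEFF) from the first column
output_row.column0 = input_row.column0 == null
        ? null
        : input_row.column0.replace("\uFEFF", "");
output_row.column1 = input_row.column1;   // pass the other columns through unchanged
output_row.column2 = input_row.column2;

The same expression could also be used directly in a tMap output mapping for column0 if adding an extra component is not desirable.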

 

Thanks & Regards

 

13 Replies
Anonymous
Not applicable
Author

@Andries Can you please tell me the details of the local file you are trying to upload (size, number of rows/columns)?

I can also reproduce the error with a dataset (10k rows / 32 columns), but I can still open it and view my data.

Can you open/view your dataset, please?

 

Anonymous
Not applicable
Author

It happens with every import, every file, every data preparation step... always.
Anonymous
Not applicable
Author

@Andries Can you please update the Info.plist file (in the /Applications/Talend Data Preparation Free Desktop.app/Contents folder, or right-click the Talend Data Preparation icon and choose "Show Package Contents").

Then add the following entry (as in the attached screenshot):

<string>-Dhystrix.command.default.execution.timeout.enabled=false</string>

Capture d’écran 2019-02-14 à 12.17.48.png
Anonymous
Not applicable
Author

Thank you, that seems to solve it!!!