Anonymous

tPostgresqlOutputBulkExec: invalid byte sequence for encoding "UTF8"

Hello,
I have a tPostgresqlOutputBulkExec component that fails with the following error:
Exception in component tPostgresqlOutputBulkExec_1_tPBE
org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0xe86365
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:1592)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1327)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:192)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:451)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:336)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:328)
Everything is fine when I use a limit of 10, 1,000, or 10,000 rows.
But when I read 100,000 rows, I get this error.
Can someone explain this error?
How can I find which column is the source of the error?
Thanks a lot, I'm blocked! 😞
2 Replies
Anonymous

Hi,
If some of your 100,000 rows are not UTF-8 compliant, you probably don't want to rewrite those rows by hand.
Try changing the encoding of your tPostgresqlOutputBulkExec (in the Advanced settings tab) so that it matches your database encoding.
Which component(s) does the data come from?
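
If you first want to see where the bad bytes are, here is a rough sketch (assuming the job writes its intermediate bulk file to disk before the COPY runs; the file path below is only a placeholder) that re-reads the file with a strict UTF-8 decoder and reports roughly where the first invalid sequence sits:

import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.InputStreamReader;
import java.nio.charset.CharacterCodingException;
import java.nio.charset.CodingErrorAction;
import java.nio.charset.StandardCharsets;

public class FindBadUtf8 {
    public static void main(String[] args) throws Exception {
        // Placeholder path: point this at the bulk file the component generates.
        String bulkFile = args.length > 0 ? args[0] : "out.csv";

        // REPORT makes the decoder throw on the first invalid byte sequence
        // instead of silently replacing it with U+FFFD.
        InputStreamReader in = new InputStreamReader(
                new FileInputStream(bulkFile),
                StandardCharsets.UTF_8.newDecoder()
                        .onMalformedInput(CodingErrorAction.REPORT)
                        .onUnmappableCharacter(CodingErrorAction.REPORT));

        long lineNo = 0;
        try (BufferedReader reader = new BufferedReader(in)) {
            while (reader.readLine() != null) {
                lineNo++;
            }
            System.out.println("File is valid UTF-8 (" + lineNo + " lines read)");
        } catch (CharacterCodingException e) {
            // Buffering makes the position approximate, but the bad row is
            // at or shortly after this line number.
            System.out.println("Invalid UTF-8 near line " + (lineNo + 1) + ": " + e);
        }
    }
}

Running it against the bulk file narrows the problem down to a line; from there the offending column is usually easy to spot by eye.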

Regards,
Lie
Anonymous

Thanks Lie.
I changed the encoding in the Advanced settings, and now my 100,000 rows are inserted correctly!
I am now trying with the entire table (more than one million rows).
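
For what it's worth, my guess (only a guess) as to why changing the encoding fixed it: 0xE8 is "è" in ISO-8859-1 / Latin-1, so the source data was probably Latin-1 rather than UTF-8. A small demo with the same three bytes from the error message, decoded both ways:

import java.nio.charset.StandardCharsets;

public class Utf8VsLatin1Demo {
    public static void main(String[] args) {
        // The bytes from the error message: 0xe8 0x63 0x65.
        byte[] raw = {(byte) 0xE8, 0x63, 0x65};

        // In ISO-8859-1 these bytes read as "èce" (a plausible fragment of a
        // French word). In UTF-8 the 0xE8 byte announces a 3-byte sequence,
        // but 0x63 is not a valid continuation byte, so the default decoder
        // substitutes the replacement character U+FFFD.
        System.out.println("as ISO-8859-1: " + new String(raw, StandardCharsets.ISO_8859_1));
        System.out.println("as UTF-8     : " + new String(raw, StandardCharsets.UTF_8));
    }
}

Telling the bulk component the real source encoding lets it hand PostgreSQL valid UTF-8 instead of the raw Latin-1 bytes.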
Regards