Rathesh
Contributor

Data truncation error

Hello,

I have created a job to load data from an on-prem SQL Server into Azure SQL Server. The source table has 25 million records. My job flow is tMSSQLInput-->tMap-->tFileOutputDelimited-->tAzureStoragePut-->tDBRow. Inside tDBRow I run a bulk load script that fetches the data from blob storage and inserts it into the database. The source and destination tables have the same schema. When the job reaches tDBRow, I get a data truncation error. When I checked the file written by tFileOutputDelimited, the lengths match: both the source and destination columns are NVarchar(120), and the value written to the file is a 60-character string. How can I resolve this?
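One thing worth checking in a case like this: a 60-character string is well within NVarchar(120) as characters, but its size in the file depends on the encoding, and a bulk load that assumes the wrong encoding can see "longer" data than the column allows. A minimal sketch of the byte-length difference (the sample value is hypothetical):

```python
# Show how the same 60-character string occupies a different number of
# bytes depending on the file encoding. The sample value is hypothetical.
sample = "x" * 60

for encoding in ("utf-8", "utf-16-le"):
    encoded = sample.encode(encoding)
    print(f"{encoding}: {len(sample)} chars -> {len(encoded)} bytes")
# utf-8:     60 chars -> 60 bytes
# utf-16-le: 60 chars -> 120 bytes
```

So the file's declared encoding and the encoding the bulk load script expects need to agree, or the apparent data length changes.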

2 Replies
Anonymous
Not applicable

Hi

Maybe the error occurs on another column, not the NVarchar(120) column you mentioned. I would suggest doing more debugging to find out which column/record throws the error.
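One way to do that debugging is to scan the delimited file from tFileOutputDelimited and record the longest value seen in each column, then compare against the destination column sizes. A sketch, assuming a semicolon delimiter and UTF-8 (adjust both to match your job settings; the file name is a placeholder):

```python
import csv

def max_field_lengths(path, delimiter=";"):
    """Return {column_index: longest value length} for a delimited file.

    The delimiter and encoding are assumptions -- match them to the
    settings used in tFileOutputDelimited.
    """
    longest = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f, delimiter=delimiter):
            for i, value in enumerate(row):
                longest[i] = max(longest.get(i, 0), len(value))
    return longest
```

Called as `max_field_lengths("output.csv")`, it returns a dict mapping column index to the longest value seen, which makes the overflowing column easy to spot.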


Regards

Shong

Rathesh
Contributor
Author

Thanks for your response. The issue was with the encoding; once I changed it to UTF-8, it worked. Now I am facing a com.microsoft.sqlserver.jdbc.SQLServerException: Read timed out error. When any one of the jobs fails because of this, all the other jobs running in parallel fail as well.
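For the read timeout, one documented knob on the Microsoft JDBC driver is the `socketTimeout` connection property (in milliseconds); raising it can keep a long-running bulk load from being cut off mid-query. A sketch of building such a URL (the host, database name, and timeout value are placeholders, not values from this job):

```python
# Sketch: append a socketTimeout (milliseconds) to a SQL Server JDBC URL.
# Host, database name, and the chosen timeout are placeholder assumptions.
def jdbc_url(host, database, socket_timeout_ms):
    return (
        f"jdbc:sqlserver://{host}:1433;"
        f"databaseName={database};"
        f"socketTimeout={socket_timeout_ms}"
    )

print(jdbc_url("myserver.database.windows.net", "mydb", 600000))
```

Whether the parallel jobs should share a connection at all is a separate question; isolating their connections would at least stop one timeout from cascading into the others.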