Anonymous
Not applicable

Import CSV to MySQL with Dynamic Schema (error when column length is large or text type)

I want to import CSV files into a MySQL database using a dynamic schema, as I will have to deal with a large number of files.

However, errors occur when the column length is more than 200 or when the column type is text.

Is there a way to fix or work around this while still using a dynamic schema?

I can import data from another database (MSSQL -> MySQL); however, when I change the source to CSV, it does not work the same way.

 

Here is the job design

[Image: Import CSV's job design]

 

Error message ([Table: tbl_inq], column type: text):

Exception in component tDBOutput_1 (Import_CSV)
java.sql.BatchUpdateException: Data truncation: Data too long for column 'biko' at row 14

Error message ([Table: tbl_mem], column length: 1000):

Exception in component tDBOutput_1 (Import_CSV)
java.sql.BatchUpdateException: Data truncation: Data too long for column 'comment' at row 14

Here is each table's schema:

[Image: tbl_inq's schema] [Image: tbl_mem's schema]

5 Replies
manodwhb
Champion II

Increase the column length for 'comment'; the data arriving in that column is longer than the current length allows.
Anonymous
Not applicable
Author

@manodwhb 

I tried increasing the column length to 500.

However, the overall row size became too large and the following error occurred.

[Image: job design (increased column length)]

tJavaRow's code:

 

// Copy the dynamic column and force every column's length to 500.
// Note: this length is in characters; the DB output maps each dynamic
// column to VARCHAR(length), so the declared row size grows with it.
Dynamic dyn = row1.dynamic_column;

for (int i = 0; i < dyn.getColumnCount(); i++) {
    DynamicMetadata meta = dyn.getColumnMetadata(i);
    meta.setLength(500);
}

row3.dynamic_column = dyn;

 

 

manodwhb
Champion II

Set the length to 1000.
Anonymous
Not applicable
Author

@manodwhb
I set the length in tJavaRow and tDBOutput's column length to 1000, but the error still occurs:

Exception in component tDBOutput_1 (Import_CSV)
java.sql.SQLSyntaxErrorException: Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535. This includes storage overhead, check the manual. You have to change some columns to TEXT or BLOBs

Anonymous
Not applicable
Author

@manodwhb 

I think I know the problem now (but no idea about the solution yet...).

I'm using a dynamic schema (column), so each column's values from the raw CSV are combined into a single column called 'dynamic_column', shown below.

The 'dynamic_column' row works if the total number of characters per row is less than about 250.

If the total number of characters is more than that, the 'Row size too large' error occurs.

 

Raw CSV

[Image: CSV row sample]

Dynamic row (works if the total character length is less than 250):

[Image: dynamic row sample]

ADA00 - ADAAA - 株式会社○○○○(○○○○) - ○○ ○○ - 104-00○○ - 東京都○○○○ - ○○○○ - 03-○○○○-○○○○ - 03-○○○○-○○○○- info@○○○○ - 月~金曜 09:30~18:00<br>土曜 09:30~15:00、日曜・祝日休み - 1969/12/31 15:00 - 1969/12/31 15:00 - 34200 - 34200 - 弊社は○○○○のタイ旅行専門店です。○○○○○○○○○○○○○○○○○○○○"の日本販売総代理店でもあります。

 

However, the total row size limit is 65,535 bytes, so rows of 250 or more characters should also fit within that limit.

MySQLSyntaxErrorException: Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535. 
You have to change some columns to TEXT or BLOBs
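The discrepancy clears up once you know that MySQL counts the *declared* maximum byte size of every VARCHAR column against the 65,535-byte row limit, not the bytes actually stored in a given row. With the tJavaRow loop setting every dynamic column's length, each column reserves its full declared size whether the data is long or short. A rough check of the arithmetic, assuming a utf8mb4 table where each declared character reserves up to 4 bytes plus a small length prefix:

```java
public class RowSizeCheck {
    static final int ROW_LIMIT = 65_535;  // MySQL row size limit, in bytes
    static final int BYTES_PER_CHAR = 4;  // worst case for utf8mb4

    // Bytes a VARCHAR(length) column reserves against the row limit:
    // declared length times max bytes per character, plus a 1- or 2-byte
    // length prefix (2 bytes once the column can exceed 255 bytes).
    static int varcharBytes(int length) {
        int dataBytes = length * BYTES_PER_CHAR;
        return dataBytes + (dataBytes > 255 ? 2 : 1);
    }

    public static void main(String[] args) {
        // How many VARCHAR columns of a given declared length fit in one row?
        System.out.println("VARCHAR(500):  " + ROW_LIMIT / varcharBytes(500) + " columns max");
        System.out.println("VARCHAR(1000): " + ROW_LIMIT / varcharBytes(1000) + " columns max");
        // VARCHAR(1000) reserves 4,002 bytes each, so only 16 fit; a 30-column
        // CSV with every column set to length 1000 is over the limit by design.
    }
}
```

Under utf8mb3 the factor is 3 bytes per character, which still caps VARCHAR(1000) columns at about 21 per table; either way, 30 wide columns cannot all stay VARCHAR.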

 

In this case, it will likely be impossible to import CSV files with more than 30 columns, which is not realistic for real-life use.

Is there a way to import CSV files with a large number of columns (a large row size)?
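The MySQL error message itself points at a way out: columns that may hold long values should become TEXT rather than VARCHAR, since TEXT/BLOB contents are stored off the row and count only a small pointer against the 65,535-byte limit. With a dynamic schema, that would mean switching a column's DB type in the tJavaRow instead of raising every length. Talend's routines.system.DynamicMetadata class is only available inside the Studio, so the snippet below sketches the decision logic with a minimal stand-in class; the setDbType call and the 255-character threshold are assumptions, not a verified Talend API.

```java
import java.util.List;

public class TextTypeSketch {
    // Minimal stand-in for Talend's DynamicMetadata (hypothetical fields).
    static class ColumnMeta {
        String name;
        String dbType;  // e.g. "VARCHAR" or "TEXT"
        int length;
        ColumnMeta(String name, String dbType, int length) {
            this.name = name; this.dbType = dbType; this.length = length;
        }
    }

    // Map any column whose declared length exceeds the threshold to TEXT,
    // so it no longer counts against the 65,535-byte row limit; short
    // columns stay VARCHAR.
    static void promoteLongColumns(List<ColumnMeta> columns, int threshold) {
        for (ColumnMeta meta : columns) {
            if (meta.length > threshold) {
                meta.dbType = "TEXT";  // in Talend: meta.setDbType("TEXT") (assumed)
            }
        }
    }

    public static void main(String[] args) {
        List<ColumnMeta> cols = List.of(
            new ColumnMeta("mem_id", "VARCHAR", 20),
            new ColumnMeta("comment", "VARCHAR", 1000),
            new ColumnMeta("biko", "VARCHAR", 2000));
        promoteLongColumns(cols, 255);
        for (ColumnMeta c : cols) {
            System.out.println(c.name + " -> " + c.dbType);
        }
    }
}
```

This keeps the declared row size small no matter how many long columns the CSV has, at the cost of losing VARCHAR-style index prefixes on the promoted columns.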