Anonymous
Not applicable

Import CSV to MySQL: Error when total row length is more than 250

I wanted to import CSV files into a MySQL database using a dynamic schema, as I will have to deal with a large number of files.

Using the job design below I was able to:

1) MySQL: import CSV files into a MySQL DB when the total character count of each row (not cell) is below 250.

2) SQL Server: import CSV files into MSSQL with no major limitations.

 

**In MySQL, a "Row size is too large" error occurred whenever the total character count of a row exceeded 250.

MySQLSyntaxErrorException: Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535. 

However, 250 characters is nowhere near the 65,535-byte limit, so my actual data (longest row: 414 characters) should have worked.
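One possible explanation (an assumption, since the generated DDL isn't shown in the thread): MySQL checks the 65,535-byte limit against each column's declared maximum byte width, not against the data actually stored. With a utf8mb4 charset, every VARCHAR character reserves up to 4 bytes, so a table whose dynamic schema declares many wide VARCHAR columns can exceed the limit even when every row is short. A back-of-envelope check (the column count and lengths below are illustrative, not taken from the job):

```java
// Sketch: why a short row can still trip MySQL's 65,535-byte row-size limit.
// The limit counts each column's *declared maximum* byte width, not the data
// actually stored. Under utf8mb4, VARCHAR(n) may need up to 4*n bytes plus a
// 1- or 2-byte length prefix.
public class RowSizeDemo {

    // Maximum bytes a utf8mb4 VARCHAR(n) column contributes to the row size.
    static int varcharBytes(int n) {
        int maxData = 4 * n;                  // up to 4 bytes per character
        int prefix = (maxData > 255) ? 2 : 1; // length prefix
        return maxData + prefix;
    }

    public static void main(String[] args) {
        int perColumn = varcharBytes(255); // 255*4 + 2 = 1022 bytes
        System.out.println("One utf8mb4 VARCHAR(255): " + perColumn + " bytes");
        // 65 such columns already exceed the 65,535-byte limit,
        // no matter how short the actual rows are:
        System.out.println("65 columns: " + (65 * perColumn) + " bytes (limit 65535)");
    }
}
```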

 

Is there a way I could fix my job design to make it work for MySQL?

** I tried to migrate the same set of tables from MSSQL to MySQL and it worked, so I think those tables don't technically exceed the row size limit;

the problem may lie in the process?

 

CSV's Row Sample


Dynamically processed row (works in MySQL if the total character count here is less than 250)

ADA00 - ADAAA - 株式会社○○○○(○○○○) - ○○ ○○ - 104-00○○ - 東京都○○○○ - ○○○○ - 03-○○○○-○○○○ - 03-○○○○-○○○○- info@○○○○ - 月~金曜 09:30~18:00<br>土曜 09:30~15:00、日曜・祝日休み - 1969/12/31 15:00 - 1969/12/31 15:00 - 34200 - 34200 - 弊社は○○○○のタイ旅行専門店です。○○○○○○○○○○○○○○○○○○○○"の日本販売総代理店でもあります。

 

Job Design (uses dynamic schema/column)

(screenshot: job design with increased column length)

 

Additionally, I also tried importing the CSVs into MSSQL and then migrating from MSSQL to MySQL.

However, the 'Row Size' error occurred again.

 

Please let me know if anything is unclear.

 

Thank you

8 Replies
Anonymous
Not applicable
Author

Hi
Does the table already exist before you run the job, or is it created by tMysqlOutput at runtime?
Refer to this page on Stack Overflow; the answer suggests using the TEXT data type instead of VARCHAR when creating the table.
https://stackoverflow.com/questions/13283381/row-size-too-large-error-in-mysql-create-table-query

Regards
Shong
Anonymous
Not applicable
Author

@shong
The table is created on tMysqlOutput .

>Refer to this page on Stack Overflow; the answer suggests using the TEXT data type instead of VARCHAR when creating the table.

I'm using a dynamic schema, so I cannot choose data types for the schema in tInputDelimited and tDBOutput.
Secondly, my data does not go over the 65,535-byte limit, but for some reason it still triggers the error (the longest row is 441 characters).

Is there a way to make it work using a dynamic schema (MySQL output)?
Anonymous
Not applicable
Author

Hi
If you know Java, refer to the links below to learn how to get the column metadata and change a column's data type from VARCHAR to TEXT before the table is created.
https://www.talend.com/blog/2019/11/11/migrate-data-between-databases-one-job-using-dynamic-schema/?...
https://community.talend.com/t5/Design-and-Development/resolved-Generic-Talend-Job/td-p/92354
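For reference, the pattern those links describe looks roughly like the sketch below, placed in a tJavaRow between the file input and tDBOutput. This assumes the dynamic column is named `dyn` and that your Talend version exposes `routines.system.Dynamic` / `DynamicMetadata` with these method names; verify them against the blog post and your Studio version before relying on this:

```java
// Sketch for a tJavaRow placed before tDBOutput (hypothetical row/column
// names). Method names (getColumnCount, getColumnMetadata, getType,
// setDbType) follow the pattern in the linked Talend blog post.
routines.system.Dynamic columns = input_row.dyn;
for (int i = 0; i < columns.getColumnCount(); i++) {
    routines.system.DynamicMetadata meta = columns.getColumnMetadata(i);
    // Force string columns to the TEXT db type so each one counts only
    // ~9-12 bytes toward MySQL's row-size limit instead of 4*length bytes.
    if ("id_String".equals(meta.getType())) {
        meta.setDbType("TEXT");
    }
}
output_row.dyn = input_row.dyn;
```

Because this rewrites the metadata carried with each dynamic row, it should apply to every table flowing through the job, not just one.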

Regards
Shong

billimmer
Creator III

I ran into this problem about two years ago. While working with consultants on a project, I deleted a MySQL table, but when I tried to re-create it I got the error you mentioned. I tried changing my VARCHAR types to TEXT, but it wasn't enough. When I asked the consultants how they were able to create the table, I discovered that this validation can be "turned off". So they were turning this validation off and on (I can't remember the name) when creating the table.

 

I didn't like this, as I read it was "possible" to have data loss - hence the validation. So we ended up setting the table engine to MyISAM.

 

ENGINE=MyISAM DEFAULT CHARSET=utf8mb4; 

 

 

Anonymous
Not applicable
Author

@shong
Thanks for getting back to me.
I don't know Java, so I haven't tried what you mentioned, but it seems it would work when processing one table at a time?
In this case, I'm dealing with multiple tables (sometimes up to a hundred), all with different schemas, so I wanted to use a dynamic schema.

Will it work with multiple tables? If yes, would you mind showing me how? I'm not familiar with Java, so I don't know how I should apply it in my case..

When I tried to migrate the same set of tables from MSSQL to MySQL, it worked fine, but I don't know why it doesn't work when importing from CSV...


Additionally, I tried importing one table at a time while defining the schemas (including data types) manually; however, I cannot find a 'text' data type in the schema view...

Anonymous
Not applicable
Author

@lli
I see, so changing the table engine to MyISAM helped resolve the error?
When I looked up the table engine types, I found:
>Although InnoDB supports row sizes larger than 65,535 bytes internally, MySQL itself imposes a row-size limit of 65,535 for the combined size of all columns.
I think this limit applies to MyISAM as well...?
billimmer
Creator III

See https://dev.mysql.com/doc/mysql-reslimits-excerpt/5.7/en/column-count-limit.html

 

"In the following MyISAM example, changing a column to TEXT avoids the 65,535-byte row size limit and permits the operation to succeed because BLOB and TEXT columns only contribute 9 to 12 bytes toward the row size."

 

This worked for our application.
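The arithmetic behind that quote is easy to sanity-check (a sketch; the 9-to-12-byte figure comes from the linked MySQL page, and the 4-bytes-per-character worst case assumes a utf8mb4 charset):

```java
// Worst-case bytes a column counts toward MySQL's 65,535-byte row limit:
// a utf8mb4 VARCHAR(n) reserves up to 4*n data bytes plus a length prefix
// (2 bytes for the wide columns considered here), while a TEXT/BLOB column
// counts only a 9-12 byte pointer regardless of content length.
public class TextVsVarchar {

    static int rowSizeContribution(boolean isText, int declaredChars) {
        if (isText) return 12;        // upper bound of the 9-12 byte pointer
        return declaredChars * 4 + 2; // utf8mb4 VARCHAR with 2-byte prefix
    }

    public static void main(String[] args) {
        System.out.println("VARCHAR(255): " + rowSizeContribution(false, 255) + " bytes");
        System.out.println("TEXT:      <= " + rowSizeContribution(true, 255) + " bytes");
    }
}
```

So converting wide VARCHAR columns to TEXT shrinks their row-size contribution by roughly two orders of magnitude, which is why the operation in the quoted example succeeds.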

Anonymous
Not applicable
Author

I see, but it seems MyISAM does not support foreign keys...
so I think I can't use it in my case... but thanks a lot.