I wanted to import CSV files into a MySQL database using a dynamic schema, as I will have to deal with a large number of files.
Using the job design below I was able to:
1) MySQL: import CSV files to the MySQL DB, but only when the total number of characters in each row (not cell) is below 250.
2) SQL Server: import CSV files to MSSQL with no problems or major limitations.
** In MySQL, a "Row size too large" error occurred whenever the total number of characters in a row exceeded 250:
MySQLSyntaxErrorException: Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535.
However, 250 characters is nowhere near the 65,535-byte limit, so my actual data (longest row: 414 characters) should have worked.
Is there a way I could fix my job design to make it work for MySQL?
** I tried to migrate the same set of tables from MSSQL to MySQL and it works, so I think those tables don't technically exceed the row size limit;
perhaps the problem lies in the process?
CSV's Row Sample
Dynamically processed row (works in MySQL if the total number of characters here is less than 250)
ADA00 - ADAAA - 株式会社○○○○(○○○○) - ○○ ○○ - 104-00○○ - 東京都○○○○ - ○○○○ - 03-○○○○-○○○○ - 03-○○○○-○○○○- info@○○○○ - 月~金曜 09:30~18:00<br>土曜 09:30~15:00、日曜・祝日休み - 1969/12/31 15:00 - 1969/12/31 15:00 - 34200 - 34200 - 弊社は○○○○のタイ旅行専門店です。○○○○○○○○○○○○○○○○○○○○"の日本販売総代理店でもあります。
Job Design (uses dynamic schema/column)
Job Design (increase column length)
Additionally, I also tried importing the CSVs to MSSQL and then migrating from MSSQL to MySQL.
However, the 'Row size' error occurred again.
Please let me know if anything is unclear.
Thank you
Hi
If you know Java, refer to the links below to see how to get the column metadata and change a column's data type from varchar to text before the table is created.
https://www.talend.com/blog/2019/11/11/migrate-data-between-databases-one-job-using-dynamic-schema/?...
https://community.talend.com/t5/Design-and-Development/resolved-Generic-Talend-Job/td-p/92354
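As a rough, untested sketch of the idea from those links: the 65,535-byte row size limit is computed from each column's declared maximum byte length (e.g. a VARCHAR length times up to 4 bytes per character for utf8mb4), not from the data actually in the row, which is why a dynamic schema that declares many wide VARCHAR columns can trip the limit even for short rows. In a tJavaRow placed between the file input and the MySQL output you could switch every string column of the dynamic schema to TEXT before the table is created. The dynamic column name "dyn" and the exact metadata method names are assumptions to verify against your Talend version.

// tJavaRow body (sketch): force string columns in the dynamic schema to MySQL TEXT
// before the output component creates the table. Assumes the dynamic column is "dyn".
routines.system.Dynamic dyn = input_row.dyn;

for (int i = 0; i < dyn.getColumnCount(); i++) {
    routines.system.DynamicMetadata meta = dyn.getColumnMetadata(i);

    // Talend string columns carry the type "id_String"; mapping them to TEXT keeps
    // them out of the 65,535-byte row size calculation (TEXT/BLOB columns only
    // contribute 9 to 12 bytes each).
    if ("id_String".equals(meta.getType())) {
        meta.setDbType("TEXT");
    }
}

output_row.dyn = dyn;

The MySQL output component should then use TEXT for those columns in the generated CREATE TABLE.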
Regards
Shong
I ran into this problem about 2 years ago. While working with consultants on a project, I deleted a MySQL table, but when I tried to re-create it I got the error you mentioned. I tried changing my varchar types to "text", but it wasn't enough. When I asked the consultants how they were able to create the table, I discovered that this validation can be "turned off", so they were turning the validation off and back on (I can't remember its name) when creating the table.
I didn't like this, as I read it was "possible" to cause data loss (hence the validation), so we ended up setting the table engine to MyISAM:
ENGINE=MyISAM DEFAULT CHARSET=utf8mb4;
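For context, here is a rough sketch (made-up table and column names, run over plain JDBC since that is what a Talend job executes anyway) of creating a table this way: the long free-text fields are TEXT so they only add 9 to 12 bytes to the row size, and the engine/charset clause above goes at the end of the statement.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreateMyIsamTable {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details - adjust to your environment.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "password");
             Statement stmt = conn.createStatement()) {
            // Long free-text fields use TEXT (counted as 9-12 bytes toward the
            // 65,535-byte row limit instead of the full declared VARCHAR length),
            // and the table uses the MyISAM engine as in the snippet above.
            stmt.executeUpdate(
                "CREATE TABLE agency (" +
                "  code VARCHAR(10)," +
                "  name TEXT," +
                "  address TEXT," +
                "  business_hours TEXT," +
                "  description TEXT" +
                ") ENGINE=MyISAM DEFAULT CHARSET=utf8mb4");
        }
    }
}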
@shong
Thanks for getting back to me.
I don't know Java, so I haven't tried what you mentioned, but it seems that it works when processing one table at a time?
In this case, I'm dealing with multiple tables (sometimes up to a hundred), all with different schemas, so I wanted to use a dynamic schema.
Will it work with multiple tables? If yes, would you mind showing me how? I'm not familiar with Java, so I don't know how I should apply it in my case.
When I tried to migrate the same set of tables from MSSQL to MySQL, it worked fine, but I don't know why it doesn't work when importing from CSV...
Additionally, I tried to import one table at a time while defining the schemas (including data types) manually; however, I cannot find a 'text' data type in the schema view...
See https://dev.mysql.com/doc/mysql-reslimits-excerpt/5.7/en/column-count-limit.html
"In the following MyISAM example, changing a column to TEXT avoids the 65,535-byte row size limit and permits the operation to succeed because BLOB and TEXT columns only contribute 9 to 12 bytes toward the row size.:
This worked for our application.
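If you need to apply that TEXT change to many existing tables (as in the hundred-table scenario above), here is a rough JDBC sketch with placeholder connection details and an arbitrary 255-character cutoff; it only prints the ALTER statements so you can review them before running anything.

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class VarcharToTextSuggestions {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details - adjust to your environment.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "password")) {
            DatabaseMetaData md = conn.getMetaData();
            // List every column in the connected database.
            try (ResultSet cols = md.getColumns(conn.getCatalog(), null, "%", "%")) {
                while (cols.next()) {
                    String table = cols.getString("TABLE_NAME");
                    String column = cols.getString("COLUMN_NAME");
                    String type = cols.getString("TYPE_NAME");
                    int size = cols.getInt("COLUMN_SIZE");
                    // Flag long VARCHAR columns; 255 is an arbitrary threshold.
                    if ("VARCHAR".equalsIgnoreCase(type) && size > 255) {
                        System.out.printf("ALTER TABLE `%s` MODIFY `%s` TEXT;%n",
                                table, column);
                    }
                }
            }
        }
    }
}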