
Whole Database Migration (MSSQL->MySQL) using Dynamic Schema
Hi,
I just recently started using Talend Cloud Data Integration (Talend Studio 7.2),
and I'm figuring out how to dynamically migrate a whole MSSQL Server database to MySQL (blank DB).
So far, I've heard that this is possible with Dynamic Schema.
However, I've only found a tutorial for migrating a single table (data), or posts that didn't explain much about the settings and components to use.
Here's my job design (UPDATED!!!)
I'm using a global variable to dynamically define the table, but I'm not sure how to define the schema dynamically.
I suppose my query is right.
Can you let me know how I can define the schema dynamically here?
Output Component
The other option I found was to use a job template; however, that doesn't help with the syntax errors that occurred during migration, like the following.
sample1)
Exception in component tDBOutput_1 (Job_tbl_mem) java.sql.SQLSyntaxErrorException: BLOB, TEXT, GEOMETRY or JSON column 'msrepl_tran_version' can't have a default value at com.mysql.cj.jd
sample2)
[FATAL]: local_test.job1_tbl_mem_0_1.Job1_tbl_mem - tDBOutput_4 Invalid default value for 'mobile_inq' java.sql.SQLSyntaxErrorException: Invalid default value for 'mobile_inq' at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:120) at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97) at
In addition, I was hoping to find a solution where one job could perform a whole database migration, e.g. iterating over the tables in the source DB and then generating the output in the destination DB.
Please let me know whether I should proceed with Dynamic Schema or a Job Template, and the job design/components/settings needed.
Thank you
Accepted Solutions

Hello,
Below is a job that migrates data from one database to another.
I do it from SQL Server to SQL Server, but you can migrate to wherever you want.
The first step is to open the connections to the two databases.
Then you list the names of the tables in the source.
You iterate and run a "select * from [global_variable]" with the dynamic schema.
I advise you to add the '[]' in case you have tables named with keywords like 'user' or 'group'.
Finally, for the target, you choose Create table if not exists or Drop and create; it's up to you.
The best practice is to recover the entire table structure with "tColumnList" and do the mapping on a case-by-case basis (look at the Talend Cloud example of a MySQL migration from AWS to Snowflake; you can download the job if you still have the trial version).
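As a rough sketch, assuming your table-list component is named tDBTableList_1 (adjust to your own component names), the Query field of the source tDBInput would contain something like:

```java
// Query field of the source tDBInput (a Java string expression in Talend).
// The table name comes from the global variable set by the iterating table-list component;
// the [] protect reserved names such as [user] or [group] on SQL Server.
"SELECT * FROM [" + ((String) globalMap.get("tDBTableList_1_CURRENT_TABLE")) + "]"
```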
Do not hesitate if you have questions.

To use Dynamic Schema you need a paid version of Talend; it is only available there. If you are using such a version, maybe this blog post will help you:
https://www.talend.com/blog/2019/11/11/migrate-data-between-databases-one-job-using-dynamic-schema/

I am on the 14-day free trial of the Cloud version.
I have read that blog a couple of times, but what I want to do here is a bit different:
I want to migrate a whole database (if possible, with a simple job design).
So, if possible, I was hoping someone could point out how I can make it work by improving my current job design.

The only difference between what the blog shows and your requirement is a section to iterate over the different tables in your DB. You would pass the table name into this job and call it from a parent job. I believe I explain the process towards the end of the blog.

I see,
so in this case I already have a job that iterates over the tables. Could I connect it to 2. "Identify column names" (tJavaFlex) as in the blog?
I can kind of relate it to my case here.
However, the number of columns and the data types in each table differ, so I was wondering if it will still work the same way?
You wrote in the blog: "I've used a single database, but in reality you will likely be using different databases. It doesn't make much difference, but you will need to make sure that the database column types are the same if you are following this. It would be possible to add some code to dynamically change the column types, but this would require extra data in the column mapping table and some extra Java code. This is not covered here."
Is what I'm trying to do here the case you mention (two different databases), i.e. the need to add extra data and Java code?
If yes, would you mind walking me through the procedure to add the extra data and Java code?
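From what I understand, the "extra Java code" would be something like a lookup from SQL Server column types to MySQL column types. Here is my rough guess at a sketch (not taken from the blog, so please correct me if I'm wrong):

```java
import java.util.HashMap;
import java.util.Map;

// Rough guess at a SQL Server -> MySQL column type lookup (illustrative only).
public class TypeMapper {

    private static final Map<String, String> TYPE_MAP = new HashMap<>();
    static {
        TYPE_MAP.put("NVARCHAR", "VARCHAR");
        TYPE_MAP.put("NCHAR", "CHAR");
        TYPE_MAP.put("DATETIME2", "DATETIME");
        TYPE_MAP.put("BIT", "TINYINT(1)");
        TYPE_MAP.put("UNIQUEIDENTIFIER", "CHAR(36)");
        TYPE_MAP.put("NTEXT", "LONGTEXT");
        TYPE_MAP.put("IMAGE", "LONGBLOB");
        TYPE_MAP.put("MONEY", "DECIMAL(19,4)");
    }

    // Returns the MySQL equivalent of a SQL Server type, or the original type if no mapping is defined.
    public static String toMySql(String sqlServerType) {
        return TYPE_MAP.getOrDefault(sqlServerType.toUpperCase(), sqlServerType);
    }

    public static void main(String[] args) {
        System.out.println(toMySql("nvarchar"));          // VARCHAR
        System.out.println(toMySql("uniqueidentifier"));  // CHAR(36)
    }
}
```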
Thanks a lot

I tried to build the job design according to the blog, but I encountered an error:
"the component is not a sub process start, can not have any link on component ok/error in input"
for DBInput1 when I tried to trigger it with OnSubjobOk.

The job described in the blog needs to be left alone. You would create a parent job and place that job inside it (using a tRunJob component). That would be linked to your table list component via an iterate link. You would create a context variable to hold the table name and pass the value from the table list to the context variable in the tRunJob. You are essentially calling the example job repeatedly, supplying the table name and a query.
Investigate the tRunJob component in the documentation. Also look at how you can build a select statement dynamically for each table using your database; you will need this before you can do the rest. Building a select statement dynamically is described on sites for the database you are using.
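As a minimal sketch, assuming the child job has a context variable called tableName and the parent job uses a tDBTableList component (use your own names), the two pieces would look roughly like this:

```java
// Parent job -- tRunJob "Context Param" tab, value passed to the child job's tableName variable:
((String) globalMap.get("tDBTableList_1_CURRENT_TABLE"))

// Child job -- Query field of the tDBInput, built dynamically from that context variable
// (for SQL Server; adjust the quoting/brackets for your source database):
"SELECT * FROM [" + context.tableName + "]"
```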


Nice solution @JohnRMK

Updated!!
I have tried what you showed me here, and I wanted to ask a few things.
1) Is 'DROP' a tDBRow component? If yes, how do you set it up? Do you use the output DB connection? What about the table name, schema, and query: all ((String)globalMap.get("tDBTableList_1_CURRENT_TABLE"))? (My guess at the tDBRow query is sketched after these questions.)
2) Would you mind showing me how you set up the tJavaFlex?
3) For the tDBInput, how do you define the schema for the input? I tried adding a dynamic column, but I'm not sure if that's right. And should the table name be left blank?
4) I tried executing the job, but there's an error. It seems the cause was a mismatch of columns between the input and output tables. Is there a way to make it work when the columns (both number and types) don't match? Ideally, the output DB will have only one table, or none at all, when I use it for actual work. (Source -> blank output DB)
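For question 1, here is my current guess at the tDBRow query (on the output MySQL connection), in case it helps show what I mean:

```java
// My guess at the tDBRow Query field on the MySQL (output) connection -- not sure this is what you meant.
// Backticks because the target is MySQL; the table name comes from the same global variable as the input side.
"DROP TABLE IF EXISTS `" + ((String) globalMap.get("tDBTableList_1_CURRENT_TABLE")) + "`"
```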
Please let me know if anything is unclear to you.
Thanks a lot!!
