I am trying to migrate IBM DB2 tables from a schema. Can Talend dynamically pick tables from the source DB, create a collection, and build a schema in MongoDB?
I tried to transfer data from a specific table to MongoDB. That works fine if I predefine my schema from DB2; if not, no data is transferred.
The requirement is that I need to fetch the schema and table names dynamically and create the collection names in MongoDB.
I need advice and inputs on its feasibility.
Wow, DB2. That takes me back... waaaaay back
Is there a way in DB2 to retrieve a schema programmatically (i.e., using DB2's query language)? If so, you could use tDB2Row to do it, then tFlowToIterate to iterate over the table names (most likely with tMap or other ETL components in between). I've done this with SQL Server, where I selected all databases from sys.databases, then iterated over them to run the same query against each one.
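For reference, here's roughly what that looks like in plain JDBC, in case you want to prototype it outside Talend first. This is just a sketch: it assumes DB2 for Linux/UNIX/Windows, where the catalog view is SYSCAT.TABLES (on DB2 for z/OS you'd query SYSIBM.SYSTABLES instead), and the connection URL and credentials are placeholders.

```java
import java.sql.*;
import java.util.ArrayList;
import java.util.List;

public class ListDb2Tables {
    public static void main(String[] args) throws SQLException {
        // Placeholder connection details -- adjust host, port, database, credentials.
        String url = "jdbc:db2://localhost:50000/MYDB";
        // SYSCAT.TABLES is the LUW catalog view; TYPE = 'T' filters to plain tables.
        String sql = "SELECT TABSCHEMA, TABNAME FROM SYSCAT.TABLES "
                   + "WHERE TYPE = 'T' AND TABSCHEMA NOT LIKE 'SYS%' "
                   + "ORDER BY TABSCHEMA, TABNAME";

        List<String> tables = new ArrayList<>();
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(sql)) {
            while (rs.next()) {
                // trim() is defensive in case the catalog returns padded values.
                tables.add(rs.getString("TABSCHEMA").trim() + "."
                         + rs.getString("TABNAME").trim());
            }
        }
        tables.forEach(System.out::println);
    }
}
```

In Talend, the same SELECT goes into tDB2Row, and tFlowToIterate plays the role of the loop here.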
Hope this helps.
David
Generally, no - Talend is not a migration tool.
You can test tools like http://mongify.com if you want to automate the transfer,
but generally that is not the real issue, because nobody migrates data as-is:
you always need to prepare a nested MongoDB structure to avoid migrating a lot of tables one-to-one.
An example from one project (not with the tool linked above):
- MySQL had 100 tables (a few huge and many small)
- We prepared, with SQL, JSON output in the full target structure
- Exported the resulting JSON to text files
- Imported with the Mongo tools
In the end we had 4 collections with all the data from MySQL (a rough sketch of this embedding idea is below).
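To show what "prepare a nested structure" means in practice, here is a rough sketch - the orders / order_lines tables, database names, and connection strings are all hypothetical, and the MongoDB Java driver is assumed. The child rows are embedded as an array in the parent document, so two tables become one collection:

```java
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

import java.sql.*;
import java.util.ArrayList;
import java.util.List;

public class EmbedChildRows {
    public static void main(String[] args) throws SQLException {
        try (Connection sql = DriverManager.getConnection(
                     "jdbc:mysql://localhost:3306/shop", "user", "password");
             MongoClient mongo = MongoClients.create("mongodb://localhost:27017")) {

            MongoCollection<Document> orders =
                    mongo.getDatabase("shop").getCollection("orders");

            try (Statement st = sql.createStatement();
                 ResultSet parents = st.executeQuery(
                         "SELECT id, customer, total FROM orders")) {
                while (parents.next()) {
                    int id = parents.getInt("id");
                    Document doc = new Document("_id", id)
                            .append("customer", parents.getString("customer"))
                            .append("total", parents.getDouble("total"));

                    // Embed the child rows as an array instead of a second collection.
                    // One query per parent is slow, but it keeps the sketch simple.
                    List<Document> lines = new ArrayList<>();
                    try (PreparedStatement ps = sql.prepareStatement(
                            "SELECT sku, qty FROM order_lines WHERE order_id = ?")) {
                        ps.setInt(1, id);
                        try (ResultSet children = ps.executeQuery()) {
                            while (children.next()) {
                                lines.add(new Document("sku", children.getString("sku"))
                                        .append("qty", children.getInt("qty")));
                            }
                        }
                    }
                    orders.insertOne(doc.append("lines", lines));
                }
            }
        }
    }
}
```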
Migrations are always one-time, unique projects. If they are not your everyday income, there is no reason to create a universal Job - it's "do and forget", or you spend more time developing something you will never use in the future.
@DVSCHWAB wrote:
Wow, DB2. That takes me back... waaaaay back
Hi David!
first of all - DB2 is still much more alive than Microsoft would like to see
second - it does not solve the problem of the dynamic structure
Well, the main idea behind the migration is to create a golden copy of the data from IBM DB2 in MongoDB, so I thought Talend would be an option. But for bulk migration and dynamic fetching of tables, I'm not sure how to handle it.
Or could we create a Java program to connect the two databases and run a loop to fetch from one DB and insert into the other? Would that be efficient in terms of speed?
It would be helpful if I could get some pointers on dealing with these constraints.
You could write a Java program, although Talend is basically a Java code generator, so that might be overkill.
As to speed, is this a one-time migration, or something that will happen on a regular basis (i.e., more a copy than a migration)? If it's just one time, your program doesn't need to be efficient; it just needs to get the job done.
Can you provide a concrete example of the difficulty you're having right now? I'm not sure what you mean by "dynamic fetch of tables". An estimate of how big the DB2 database is would also be helpful.
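If you do decide to roll your own, the "dynamic" part is smaller than it sounds. Here's a sketch - untested, with placeholder connection strings, and the type mapping only roughed in (dates are converted, but DECIMALs, BLOBs, etc. are left to the default getObject behavior). The key idea is that ResultSetMetaData tells you the columns at runtime, so no schema has to be predefined:

```java
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

import java.sql.*;
import java.util.ArrayList;
import java.util.List;

public class CopyTableDynamically {

    // Copies one DB2 table into a MongoDB collection of the same name,
    // discovering the columns at runtime via ResultSetMetaData.
    static void copyTable(Connection db2, MongoClient mongo,
                          String schema, String table) throws SQLException {
        MongoCollection<Document> target =
                mongo.getDatabase(schema.toLowerCase()).getCollection(table.toLowerCase());

        try (Statement st = db2.createStatement();
             ResultSet rs = st.executeQuery(
                     "SELECT * FROM \"" + schema + "\".\"" + table + "\"")) {
            ResultSetMetaData meta = rs.getMetaData();
            int cols = meta.getColumnCount();

            List<Document> batch = new ArrayList<>();
            while (rs.next()) {
                Document doc = new Document();
                for (int i = 1; i <= cols; i++) {
                    Object v = rs.getObject(i);
                    // The BSON codecs don't handle java.sql's date subtypes directly.
                    if (v instanceof Timestamp || v instanceof java.sql.Date) {
                        v = new java.util.Date(((java.util.Date) v).getTime());
                    }
                    doc.append(meta.getColumnLabel(i), v);
                }
                batch.add(doc);
                if (batch.size() == 1000) {   // flush in batches to keep memory flat
                    target.insertMany(batch);
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) {
                target.insertMany(batch);
            }
        }
    }

    public static void main(String[] args) throws SQLException {
        // Placeholder connection strings and names.
        try (Connection db2 = DriverManager.getConnection(
                     "jdbc:db2://localhost:50000/MYDB", "user", "password");
             MongoClient mongo = MongoClients.create("mongodb://localhost:27017")) {
            copyTable(db2, mongo, "MYSCHEMA", "CUSTOMERS");
        }
    }
}
```

Wrap that copyTable call in a loop over the catalog query from my earlier post and you have the whole "dynamic" Job. Whether that's worth building versus a one-off manual export is another question.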
David
Did you check the link I provided above?
What is the size of the source database (number of tables, and in GB)?
An example would be: I have a database with 20 to 30 schemas, and in each schema we have at least 10 to 20 tables. Using Talend, I would like to create a Job that automatically picks the schemas and tables from DB2, creates a corresponding database in MongoDB (documents and collections), and migrates the data into it.
I hope my requirement is clear.
Yes, I did go through mongify. I need a clarification: will that work for IBM DB2 as well? And I want to pick the schemas and columns automatically rather than defining them.
Will that be possible?
What I personally do on similar projects (I have had a few that were more or less similar):
1. Export the data to JSON - either generated directly with SQL (as in the example above), or with the tool https://dbeaver.jkiss.org - select all tables and export to JSON
2. Using mongoimport, or again GUI tools, import everything into MongoDB - for example, with Studio 3T select all the JSON files and import them at once
The main benefit - I never spend more than one day on this "problem" (if no transformations are required).
An example would be: I have a database with 20 to 30 schemas, and in each schema we have at least 10 to 20 tables.
That's 40-60 mouse clicks.
With a moderate size of data it is not a problem at all;
if you have terabytes you need to think more, but in any case it does not depend on the approach - the network would be the bottleneck.
On real projects you have a target model, so you need to:
- concatenate a few columns
- multiply others
- add math, lookups
- etc., etc., etc.
But for any of these etc., etc., etc., you must know:
- which columns to concatenate
- which columns to multiply
- which columns to use in math formulas
- etc., etc., etc.
So a dynamic schema is not really relevant in practice, and when it is needed - I know how to solve it (a short illustration follows below).
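To make that concrete, a tiny made-up example - the column names (FIRST_NAME, PRICE, etc.) are hypothetical, and the point is precisely that the mapping is hard-coded:

```java
import org.bson.Document;
import java.sql.ResultSet;
import java.sql.SQLException;

public class CustomerTransform {
    // Hypothetical columns: FIRST_NAME, LAST_NAME, PRICE, QTY.
    // The mapping is explicit on purpose -- you have to know which
    // columns to concatenate, multiply, or look up; no generic
    // "dynamic schema" Job can decide that for you.
    static Document toDocument(ResultSet rs) throws SQLException {
        return new Document()
                .append("name", rs.getString("FIRST_NAME") + " "
                              + rs.getString("LAST_NAME"))          // concatenate
                .append("lineTotal", rs.getDouble("PRICE")
                                   * rs.getInt("QTY"));             // multiply / math
    }
}
```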