Anonymous
Not applicable

Data Migration from Oracle to PostgreSQL Server Databases

Hi Everyone,

We have tried multiple approaches for the history data load from an Oracle database to a PostgreSQL database.
Talend handled the history data transfer from one DB to the other perfectly, but it is running very slowly. We have very large tables (60 to 100 million rows).

 

Could you please help us improve the migration speed (data load speed) using the Talend Data Integration tool for transferring tables with 60 to 100 million rows?

 

Data loading is very slow, around 30 rows per second.

We have used the components below for the data transfer:

[Screenshot of the Talend job showing the components used]

 


6 Replies
Anonymous
Not applicable
Author

Hi,

 

Please use the Bulk Output components for PostgreSQL if you are planning to process millions of records. tDBOutput is not meant for bulk transactions. If you use tDBOutputBulkExec, the processing time will come down drastically.

 

Please also increase the Java memory parameters of the job so that the throughput will be better.
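As a rough illustration (the exact values depend on the RAM available on the machine running the job; these are standard JVM options rather than Talend-specific settings), the heap size can be raised in the job's Run view > Advanced settings, for example:

    -Xms1024M    (initial heap size)
    -Xmx4096M    (maximum heap size)

A larger heap generally reduces garbage-collection pressure on multi-million-row loads, which tends to improve throughput.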

 

Warm Regards,
Nikhil Thampi

Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved 🙂

Anonymous
Not applicable
Author

Thank you, Nikhil, for your suggestion.

 

The Oracle DB is on one server and the PostgreSQL DB is on a different server, but Talend is installed on my local system.

Network transfer is very slow with this approach. Does the bulk output option work in this setup, or should I install Talend on another server?

 

Please suggest the best approach for migrating the big tables. The PostgreSQL DB server is a Unix-based Linux system.

 

Thanks,

srini

 

Anonymous
Not applicable
Author

Hi Nikhil,

 

We have tried the option with tDBOutputBulkExec, but it is creating a CSV file on the local system instead of inserting the data into the PostgreSQL server table.

 

How do we load data from one server's DB into the other server's DB using tDBOutputBulkExec?

 

Thanks,

Srini

Anonymous
Not applicable
Author

Hi,

 

Since you are having network-related issues, the first step is to remove your local system from the migration equation. Why don't you build the Talend job and run it as an executable on the target PostgreSQL server itself? This approach is just for the migration, and afterwards you can remove the jobs.

 

In this way, only two servers are involved in the migration, and you can use the bulk process to push the data into the target system. I hope this approach resolves your issue.
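As a minimal sketch of that idea (the job, archive, and path names below are placeholders; the exact launcher script name depends on how the job is built in the Studio), you would use Build Job to export the job as a standalone archive, copy it to the PostgreSQL server, and run the generated launcher there:

    # Copy the exported archive to the target server (names are examples)
    scp MigrateHistory_0.1.zip user@postgres-server:/opt/migration/
    # On the PostgreSQL server: unpack and run the generated launcher script
    unzip MigrateHistory_0.1.zip
    ./MigrateHistory/MigrateHistory_run.sh

Running the job on the same host as PostgreSQL keeps the bulk CSV file local to the database, so the COPY step no longer depends on your workstation's network.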

 

Please spare a second to mark the topic as resolved once it is completed.

 

Warm Regards,
Nikhil Thampi

Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved 🙂

Anonymous
Not applicable
Author

Thank you, Nikhil.

The bulk load option is working fine for the bigger tables; it loads very fast on the server.

 

But we are facing a date format issue with the bulk execution via the CSV file. The Oracle source table has dates in the "DD-MON-RR HH24:MI:SS" (or "DD-MON-RR") format, but the target PostgreSQL requires the "yyyy-MM-dd HH:mm:ss" timestamp format.

 

How can we convert the dates to the timestamp format during the BulkExec load, i.e. from "DD-MON-RR HH24:MI:SS" (or "DD-MON-RR") to "yyyy-MM-dd HH:mm:ss"?

 

Issue:

ERROR: date/time field value out of range: "15-11-2005"
HINT: Perhaps you need a different "datestyle" setting.
CONTEXT: COPY ser_rcrepm_tb_temp, line 2, column repm_work_fnsh_estmt_dtime: "15-11-2005" 

 

Thank you! for your help in advance.

 

Regards,

srini

Anonymous
Not applicable
Author

Hi,

 

You can add a tMap in between and change the date format for the required columns to the format of your choice. Since the data is already in date format, you only have to change the format in the output section of the tMap.
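A minimal sketch of such a tMap output expression (assuming the input flow is named row1 and the output column is typed as String; the column name is taken from the COPY error above):

    // Java expression in the tMap output column, using Talend's built-in TalendDate routine.
    // Formats the Oracle DATE value as the "yyyy-MM-dd HH:mm:ss" string PostgreSQL expects,
    // with a null guard for nullable columns.
    row1.repm_work_fnsh_estmt_dtime == null
        ? null
        : TalendDate.formatDate("yyyy-MM-dd HH:mm:ss", row1.repm_work_fnsh_estmt_dtime)

Alternatively, if you keep the output column as a Date, setting its Date Pattern to "yyyy-MM-dd HH:mm:ss" in the output schema should make the bulk CSV writer emit the same format.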

 

Warm Regards,
Nikhil Thampi

Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved 🙂