Anonymous
Not applicable

Update to PostgreSQL table takes more than 9 hours for 100,000 records

Dear Experts,

I have the situation described below: updating a single column takes more than 9 hours.

Source: MS SQL table

Destination: PostgreSQL table

Transformation: tMap (to match new and old records using id)

Use case: the source table contains 100,000 records with 5 columns. I had already inserted those records with 4 columns into the destination table. Now the business wants the 5th column too, so I created the 5th column in the destination table, added it to the source schema, and mapped and configured it in tMap. I also introduced one more output component, configured it for Update only, and selected the new column.

 

When I ran the job, it ran for more than 9 hours without completing.
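For context, an update-only output after a tMap typically issues one UPDATE statement per record; if the destination `id` column has no index, every UPDATE scans the whole table, which can easily account for hours on 100,000 rows. A minimal sketch of the set-based alternative (table and column names here are assumptions, not from the original job):

```python
# Sketch: replace 100,000 individual UPDATEs with one set-based
# UPDATE ... FROM join. All identifiers below are hypothetical.
def build_set_based_update(dest: str, staging: str, key: str, col: str) -> str:
    """Build a single UPDATE joining the staging table on the key column."""
    return (
        f"UPDATE {dest} AS d "
        f"SET {col} = s.{col} "
        f"FROM {staging} AS s "
        f"WHERE d.{key} = s.{key}"
    )

sql = build_set_based_update("dest_table", "staging_table", "id", "col5")
# Executed via psycopg2 against a live database (not run here):
# import psycopg2
# with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
#     cur.execute(sql)
```

An index on `id` in the destination table matters either way: without it, each per-row UPDATE degenerates into a full table scan.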


 

2 Replies
jeoste
Creator II

Can you show us your mapping inside the tMap?
100,000 rows is not big; maybe you can try to:
- insert all your data
- after the insert, update all the data you need

You can also try a single output with "Action on data" set to "Insert or update".

Another test is to change the output to a tDBOutputBulkExec (which is the same as tDBOutputBulk + tDBBulkExec): it writes a plain file and bulk-loads it into your database, so it takes less time than a tDBOutput.
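The bulk component gains its speed by streaming a flat file through PostgreSQL's COPY instead of issuing per-row statements. A rough sketch of the same idea done by hand, assuming psycopg2 and hypothetical staging/destination table names:

```python
import io

def rows_to_copy_buffer(rows):
    """Serialize (id, col5) tuples into a tab-delimited buffer for COPY FROM STDIN."""
    buf = io.StringIO()
    for row in rows:
        buf.write("\t".join(str(v) for v in row) + "\n")
    buf.seek(0)
    return buf

buf = rows_to_copy_buffer([(1, "a"), (2, "b")])
# With a live connection (not run here), load the staging table via COPY,
# then apply one set-based UPDATE:
# cur.copy_expert("COPY staging_table (id, col5) FROM STDIN", buf)
# cur.execute("UPDATE dest_table d SET col5 = s.col5 "
#             "FROM staging_table s WHERE d.id = s.id")
```

The point is that the database sees one bulk load plus one UPDATE, rather than 100,000 round trips.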

I'm not sure about everything; you'll have to test the different approaches and share the results here for further investigation if it doesn't work...
Anonymous
Not applicable
Author

Thank you, jeoste.

I tried tDBOutputBulkExec to update 1 million records in the PostgreSQL database, but it fails with the error below.

tDBOutputBulkExec_1 tPBE org.postgresql.util.PSQLException: An I/O error occurred while sending to the backend.

 

For 100,000 records, however, it ran fine and updated.
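"An I/O error occurred while sending to the backend" usually means the server dropped the connection mid-transfer (timeouts, memory pressure, or statement limits during one very large bulk load). Since 100,000 rows succeed, a common workaround is to split the million rows into batches of that size and commit per batch. A minimal sketch, with the load/commit calls only indicated as comments since they need a live connection:

```python
def chunks(rows, size):
    """Yield fixed-size slices so each bulk load and commit stays small."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

batches = list(chunks(list(range(1_000_000)), 100_000))
# Per batch (hypothetical helpers, not run here):
# for batch in chunks(all_rows, 100_000):
#     load_batch(cur, batch)   # e.g. COPY into a staging table
#     conn.commit()            # release server resources between batches
```

In a Talend job the equivalent knobs are the commit interval on the output component, or iterating the bulk load over slices of the source.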