
Anonymous
Not applicable

Updating 1,000,000 (1M) records in a Postgres DB takes a long time

Dear Experts,

I have a job that updates 4 fields every day, and the data volume is 1,000,000 rows.

I have tried all of the options below.

tDBOutput - update action -> about 4 hours (I tried increasing the batch rows)

tDBRow - update query, but roughly the same runtime

 

tDBOutputBulkExec - looks very good, less than 5 seconds, but I am having problems.

My destination table schema for those columns is (BigDecimal, BigDecimal, varchar(20), timestamp without time zone).

I am getting multiple errors: a BigDecimal type error, a "varchar should not be above 104222222" error, and a timestamp error.

 

What could be the reason? Is it because this component generates a temporary file and changes the data types?

If so, how do I change the data types to match the table?
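
For context on the temporary file: the bulk component writes the rows to a delimited file and loads it with PostgreSQL's COPY, after which the changes can be applied with a single set-based UPDATE, which is why it finishes in seconds instead of hours. Below is a minimal SQL sketch of that staging-table pattern, using hypothetical table and column names and the column types listed above (not the exact SQL the Talend component generates):

-- Staging table whose types match the Talend schema
-- (BigDecimal -> numeric, String(20) -> varchar(20), Date -> timestamp without time zone).
CREATE TEMP TABLE orders_stage (
    order_id   bigint,
    amount     numeric(18,2),
    tax        numeric(18,2),
    status     varchar(20),
    updated_at timestamp without time zone
);

-- Bulk-load the delimited file produced by the job (a server-side COPY reads a file on the DB host).
COPY orders_stage FROM '/tmp/orders_update.csv' WITH (FORMAT csv, DELIMITER E'\t', NULL '');

-- One set-based UPDATE instead of a million single-row updates.
UPDATE orders o
SET    amount     = s.amount,
       tax        = s.tax,
       status     = s.status,
       updated_at = s.updated_at
FROM   orders_stage s
WHERE  o.order_id = s.order_id;

With an index on the key column, the single UPDATE join is typically far faster than issuing one UPDATE per row.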

1 Solution

Accepted Solutions
manodwhb
Champion II

Since the file is saved by the Talend user, that user will have permission to read it on the same machine, even from a different directory. If the database is on a different server, it cannot read the file directly; you would need to transfer the file to that system yourself before the next operation.

View solution in original post

3 Replies
billimmer
Creator III

For tPostgresqlOutputBulk, these are the Advanced settings I use:

Null String: ""

Field Separator "\t"

Escape char "\\"

Text enclosure "\""

 

And match those settings on the tPostgresqlBulkExec side, with the advanced file type set to CSV File (see the sketch below for how they map to a COPY command).

 

I had problems using the default control settings.
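
A minimal sketch of how those four settings line up with the options of a PostgreSQL COPY in CSV mode (hypothetical staging table and file path; the Talend components build the actual command internally):

-- Null String ""      -> NULL ''
-- Field Separator "\t" -> DELIMITER E'\t'
-- Escape char "\\"     -> ESCAPE E'\\'
-- Text enclosure "\""  -> QUOTE '"'
COPY orders_stage
FROM '/tmp/orders_update.csv'
WITH (FORMAT csv, DELIMITER E'\t', NULL '', QUOTE '"', ESCAPE E'\\');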

 

Anonymous
Not applicable
Author

Thank you.

tDBOutputBulkExec works well after I corrected the schema and data types.

The questions now: the tDBOutputBulkExec file is not accessible if I save the .csv file in another directory on the same machine; I get a permission error.

- Should the .csv file for the tDBOutputBulkExec bulk component be on the same server where my destination DB exists?

- Can I save the .csv file for the tDBOutputBulkExec component in another directory on the same machine? If so, does that file need full read permission?

- Will tDBOutputBulkExec still save and read the .csv file if my job server and DB server are on different machines?

 

manodwhb
Champion II

Since the file is saved by the Talend user, that user will have permission to read it on the same machine, even from a different directory. If the database is on a different server, it cannot read the file directly; you would need to transfer the file to that system yourself before the next operation.
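
To make that concrete, here is a hedged sketch of standard PostgreSQL behaviour with hypothetical paths: a server-side COPY resolves the path on the database host, while psql's \copy (or COPY ... FROM STDIN through a driver) streams a file that lives on the client, e.g. the job server.

-- Server-side COPY: the file must exist on the database server and be
-- readable by the postgres server process.
COPY orders_stage FROM '/var/lib/postgresql/import/orders_update.csv' WITH (FORMAT csv, DELIMITER E'\t');

-- Client-side alternative (psql meta-command): the path is resolved on the
-- machine running psql, e.g. the Talend job server, and the data is streamed
-- to the database over the connection.
-- \copy orders_stage FROM '/home/talend/out/orders_update.csv' WITH (FORMAT csv, DELIMITER E'\t')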