Dear Experts,
I have a job that updates 4 fields every day, and the data volume is about 1,000,000 rows.
I tried all of the options below:
- tDBOutput with the Update action -> 4 hours (I also tried increasing the batch rows; a JDBC sketch of what this amounts to follows the list)
- tDBRow with an update query -> roughly the same runtime
- tDBOutputBulkExec -> looks very good, less than 5 seconds, but I am having problems.
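For context, the two slow paths boil down to per-row updates. A minimal sketch (connection details, table, and column names are placeholders, not from the actual job) of the batched JDBC update that tDBOutput's Update action roughly corresponds to:

```java
import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Timestamp;

public class BatchedUpdate {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "pass")) {
            conn.setAutoCommit(false);
            String sql = "UPDATE target_table SET amount = ?, rate = ?, "
                       + "code = ?, updated_at = ? WHERE id = ?";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                for (long id = 1; id <= 1_000_000; id++) {
                    ps.setBigDecimal(1, new BigDecimal("1.23")); // placeholder values
                    ps.setBigDecimal(2, new BigDecimal("0.05"));
                    ps.setString(3, "CODE");
                    ps.setTimestamp(4, new Timestamp(System.currentTimeMillis()));
                    ps.setLong(5, id);
                    ps.addBatch();
                    if (id % 10_000 == 0) ps.executeBatch(); // flush one batch
                }
                ps.executeBatch();
            }
            conn.commit();
        }
    }
}
```

Even with batching, every row is an individual UPDATE with its own index maintenance and WAL write, which is consistent with hours at this volume, whereas a COPY-based bulk load writes the data in one streamed pass.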
My destination table schema for those columns is (BigDecimal, BigDecimal, varchar(20), timestamp without time zone).
I am getting multiple errors: a BigDecimal type error, a varchar error ("should not above 104222222"), and a timestamp error.
What could be the reason? Is it because this component generates a tmp file and changes the data types?
If yes, how do I change the data types to match the table?
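If it helps: the bulk components write the rows out to a delimited text file first and then load that file, so every field has to be serialized as text that PostgreSQL can parse back into the column's type. A minimal sketch (the column names amount, rate, code, and updatedAt are hypothetical) of formatting the four values, e.g. in a tJavaRow, so the generated file matches the table types:

```java
import java.math.BigDecimal;
import java.text.SimpleDateFormat;
import java.util.Date;

// Hypothetical formatter for the four updated columns; the names are
// placeholders, not taken from the actual job.
public class RowFormatter {
    // "timestamp without time zone" accepts this text form.
    private static final SimpleDateFormat TS =
            new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

    public static String[] format(BigDecimal amount, BigDecimal rate,
                                  String code, Date updatedAt) {
        return new String[] {
            amount == null ? "" : amount.toPlainString(), // no scientific notation
            rate == null ? "" : rate.toPlainString(),
            // keep within varchar(20) so the load does not reject the row
            code == null ? "" : (code.length() > 20 ? code.substring(0, 20) : code),
            updatedAt == null ? "" : TS.format(updatedAt)
        };
    }
}
```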
For the tPostgresqlOutputBulk advanced settings I use:
- Null String: ""
- Field Separator: "\t"
- Escape char: "\\"
- Text enclosure: "\""
Then match those settings on the tPostgresqlBulkExec side, with advanced file type = CSV File. I had problems using the default control settings.
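Those options line up with a PostgreSQL COPY command. For reference, a minimal JDBC sketch (connection string, table, and file path are placeholders) of the server-side COPY those settings describe:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class ServerSideCopy {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "pass");
             Statement st = conn.createStatement()) {
            // Mirrors the settings above: tab delimiter, empty-string NULL,
            // backslash escape, double-quote enclosure, CSV format.
            // Note: COPY FROM 'file' is read by the database server process,
            // so the role needs server-side file-read rights.
            st.execute("COPY target_table FROM '/tmp/bulk.csv' "
                     + "WITH (FORMAT csv, DELIMITER E'\\t', NULL '', "
                     + "QUOTE '\"', ESCAPE '\\')");
        }
    }
}
```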
Thank you.
tDBOutputBulkExec works well after I corrected the schema and data types.
The questions now: the .csv file for tDBOutputBulkExec is not accessible if I save it in another directory on the same machine; I get a permission error.
- Should the .csv file for the tDBOutputBulkExec component be on the same server as my destination DB?
- Can I save the .csv file for the tDBOutputBulkExec component in another directory on the same machine? If so, does that file need full read permissions?
- Will tDBOutputBulkExec save and read the .csv file if my job server and DB server are on different machines?
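One data point that may help with the last question: if the component issues a server-side COPY FROM 'file', the file is opened by the database server process on the DB host, which would explain the permission error and would not work across machines. The PostgreSQL JDBC driver can instead stream the file from the client with COPY FROM STDIN; a minimal sketch (host, table, and path are placeholders):

```java
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;

import org.postgresql.copy.CopyManager;
import org.postgresql.core.BaseConnection;

public class ClientSideCopy {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://dbhost:5432/mydb", "user", "pass");
             FileReader csv = new FileReader("/data/bulk.csv")) {
            // COPY FROM STDIN streams the file over the JDBC connection, so
            // the job server and DB server can be different machines and the
            // file only needs to be readable by the job's own OS user.
            CopyManager cm = new CopyManager(conn.unwrap(BaseConnection.class));
            long rows = cm.copyIn(
                "COPY target_table FROM STDIN WITH (FORMAT csv, "
              + "DELIMITER E'\\t', NULL '', QUOTE '\"', ESCAPE '\\')", csv);
            System.out.println("Loaded rows: " + rows);
        }
    }
}
```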