toni_lajsner
Contributor III

Failed to write a file to the final destination

Hi all

Sometimes we are getting the following log messages:

Failed to move data file 'F:\Attunity\Replicate\data\tasks\file.csv' to the final destination.

and 

Failed to check if a file should be closed or failed to close a file.

and

Failed to write a file to the final destination.
Write entire file failed: source = '.csv' open type = 2
Failed to write entire file (second trial)
Failed to upload <.csv>
JAVA_EXCEPTION, message: 'io.swagger.client.ApiException: java.net.UnknownHostException: depprdweuadls002.dfs.core.windows.net'

My question is: what is the best practice here? We would prefer not to reload the tables, as they can be large.

We are using ADLS as our target, with a file size setting of 20480 KB.

 

6 Replies
Pedro_Lopez
Support

Hi Toni,

 

Thanks for reaching out. Usually these errors are related to a disconnect from the source. Do you see any other errors in the logs? Do you know if the source got disconnected at some point?

 

 

Regards,

Pedro

toni_lajsner
Contributor III
Author

Do we need to be concerned about this message? I am thinking about missing records and possibly reloading the task.

Pedro_Lopez
Support

Hi Toni,

This message could lead to missing data, as the file is not closed correctly. Did you have any disconnects from the source?

I'd suggest raising a support case with us: enable the TARGET_APPLY component at TRACE level and attach the task's diagnostic package to the case for further analysis.

 

Regards,

Pedro 

john_wang
Support

Hello @toni_lajsner ,

In general these errors/warnings will not cause data loss, as Replicate will automatically retry transferring the interim file, or reload the whole table. You can confirm this by reading the task log file and checking whether the following line appears after the error messages:

2023-01-13T05:05:11 [TASK_MANAGER ]I: Reloading table 1 because subtask #1 finished with error (replicationtask.c:2644)

You will then likely see further lines such as:

2023-01-13T05:39:42 [TASK_MANAGER ]I: Loading finished for table 'xxx'.'yyy' (Id = 1) by subtask 1. <nnnnn> records transferred. (replicationtask.c:3012)

where xxx is the schema name, yyy is the table name, and nnnnn is the number of rows.

If you can see the above lines, then there is no need to worry: all rows will be transferred to the target side by the second retry during the Full Load stage.
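The check above can be scripted. Here is a minimal sketch (not Replicate tooling; the marker strings are taken from the log lines quoted above, and the sample lines are illustrative):

```python
import re

# TASK_MANAGER log markers that indicate an automatic recovery:
# the table reload starting, and the load finishing successfully.
RELOAD = re.compile(r"\[TASK_MANAGER\s*\]I: Reloading table")
FINISHED = re.compile(r"\[TASK_MANAGER\s*\]I: Loading finished for table")

def check_recovery(log_lines):
    """Return (reload_seen, finished_seen) for an iterable of log lines."""
    reload_seen = any(RELOAD.search(line) for line in log_lines)
    finished_seen = any(FINISHED.search(line) for line in log_lines)
    return reload_seen, finished_seen

# Example using the two log lines quoted in this thread:
sample = [
    "2023-01-13T05:05:11 [TASK_MANAGER ]I: Reloading table 1 because subtask #1 finished with error (replicationtask.c:2644)",
    "2023-01-13T05:39:42 [TASK_MANAGER ]I: Loading finished for table 'xxx'.'yyy' (Id = 1) by subtask 1. <nnnnn> records transferred. (replicationtask.c:3012)",
]
print(check_recovery(sample))  # -> (True, True)
```

If both values come back True, the retry path described above completed; if only the reload marker appears, the load may still be in progress or may have failed again.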

Hope this helps.

Regards,

John.

Help users find answers! Do not forget to mark a solution that worked for you! If already marked, give it a thumbs up!
toni_lajsner
Contributor III
Author

I am not seeing "TASK_MANAGER" reload the table or finish loading, only that the connection to our target failed, along with "failed while preparing stream component".

Also, looking at the message "Target last committed record id from the previous run is '***' (streamcomponent.c:1664)": I cannot find this record id anywhere. Is that Qlik's record id, and how can I use this information to see whether the record was added to our target?

 

 

john_wang
Support

Hello @toni_lajsner ,

There are 3 major types of information in a Replicate task log file: ERROR, WARNING, and INFORMATION. In the task log file you can see the component name and the logging type, marked for example as "]E:", "]W:", and "]I:" respectively.

For your error, "failed while preparing stream component", in general the line looks like "[TARGET_LOAD ]E:", which means the target endpoint connection cannot be established or the target component cannot start up. For this type of error you do not need to set any additional logging level; it is always printed when an error occurs.
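As an illustration of those markers, here is a small sketch that tallies log lines by severity (the marker strings follow the format described above; the parsing itself is my own helper, not part of Replicate):

```python
from collections import Counter

# Severity markers as they appear in Replicate task log lines:
# "]E:" = ERROR, "]W:" = WARNING, "]I:" = INFORMATION.
MARKERS = {"]E:": "ERROR", "]W:": "WARNING", "]I:": "INFORMATION"}

def count_by_severity(log_lines):
    """Count log lines per severity, based on the ]X: marker in each line."""
    counts = Counter()
    for line in log_lines:
        for marker, name in MARKERS.items():
            if marker in line:
                counts[name] += 1
                break
    return counts

# Illustrative sample lines based on messages quoted in this thread:
sample = [
    "2023-01-13T05:05:11 [TARGET_LOAD  ]E: failed while preparing stream component",
    "2023-01-13T05:05:12 [TASK_MANAGER ]I: Reloading table 1 because subtask #1 finished with error",
]
print(dict(count_by_severity(sample)))  # {'ERROR': 1, 'INFORMATION': 1}
```

Filtering on "]E:" first is a quick way to find the root-cause line before raising the verbose levels mentioned below.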

It is not clear whether this is a random or a persistent error. We'd suggest setting the TARGET_LOAD and TARGET_APPLY components to Verbose and checking whether that helps solve the problem. If you need help from the support team, please open a ticket and attach the task Diagnostics Package; we'd love to help.

Regards,

John.
