deepaksahirwar
Creator II

Error in bulk | Qlik Replicate

Dear Community/Users,

I am facing an error on a Qlik Replicate task. Could you please help me understand what the error log below relates to?


00014908: 2024-02-20T19:03:23 [TARGET_APPLY ]I: Error in bulk, bulk state: bulk confirmed record id - '0', bulk last record id - '0', confirmed record id - '493319933', sorter confirmed record id - '493319933' (bulk_apply.c:2428)
00014908: 2024-02-20T19:03:23 [FILE_FACTORY ]E: azure upload failed: WinHttpWriteData: 12002: The operation timed out [1002803] (azw_upload_file.cpp:96)
00014908: 2024-02-20T19:03:23 [FILE_FACTORY ]E: Failed to upload local file 'F:\Program Files\Attunity\Replicate\data\tasks\Confidential_CRITICAL\cloud\bulk\CDC00000001.csv.gz' to container 'blob-attunity-prod' as block blob 'Confidential/0/CDC00000001.csv' [1002803] (azw_upload_file.cpp:42)
00014908: 2024-02-20T19:03:23 [FILE_FACTORY ]E: Failed to upload <F:\Program Files\Attunity\Replicate\data\tasks\Confidential_CRITICAL\cloud\bulk\CDC00000001.csv.gz> to <blob-attunity-prod/Confidential/0/CDC00000001.csv> [1000722] (at_azure_ff.c:328)
00014908: 2024-02-20T19:03:23 [FILE_FACTORY ]E: Failed to write entire file (second trial) [1000722] (at_universal_fs_object.c:685)
00014908: 2024-02-20T19:03:23 [FILE_FACTORY ]E: Write entire file failed: source = 'F:\Program Files\Attunity\Replicate\data\tasks\Confidential_CRITICAL\cloud\bulk\CDC00000001.csv.gz' target = 'blob-attunity-prod/Confidential/0/CDC00000001.csv' open type = 3 [1000731] (at_universal_fs_object.c:325)
00014908: 2024-02-20T19:03:23 [TARGET_APPLY ]E: Failed to add file F:\Program Files\Attunity\Replicate\data\tasks\Confidential_CRITICAL\cloud\bulk\CDC00000001.csv.gz to cifta list [1000731] (cloud_bulk.c:962)
00014908: 2024-02-20T19:03:23 [TARGET_APPLY ]E: Failed to close logical record [1000731] (cloud_bulk.c:352)

 

Note: For confidentiality reasons, I have renamed the application to 'Confidential_CRITICAL'.

Thank you in advance.

Best Regards,

Deepak

1 Solution

Accepted Solutions
SachinB
Support

Hello @deepaksahirwar ,

Thank you for reaching out to the Qlik community!

The error message "WinHttpWriteData: 12002: The operation timed out" typically indicates that there was a timeout while attempting to write data over HTTP. This error can occur for various reasons, including network issues, server problems, or configuration problems.

Could you please try uploading this file manually from the Replicate server to the target endpoint and see how it behaves? Also, please check whether the file is large or small:

F:\Program Files\Attunity\Replicate\data\tasks\Confidential_CRITICAL\cloud\bulk\CDC00000001.csv.gz

 

The default query timeout value is 600 seconds, which should be sufficient for most situations. However, when loading very large tables, you may need to increase the value to prevent timeouts. This can be done with the following internal parameter; try increasing it to 10x the default:

executeTimeout

Replicate-UG 

 

Helpful-Links-Operation Timeout 

Also check the feasibility of increasing the timeout option at the target endpoint.
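To illustrate why raising the timeout helps: a transient network stall makes a fixed-deadline write fail even though a longer (or retried) attempt would succeed. The sketch below is a generic retry-with-growing-timeout pattern, not Qlik Replicate's internal code; `upload_fn` is a hypothetical callable standing in for the blob upload.

```python
import time

def upload_with_retry(upload_fn, retries=3, base_delay=2.0,
                      timeout_multiplier=2.0, timeout=600.0):
    """Call upload_fn(timeout); on a timeout, retry with a longer
    timeout and exponential backoff. Mirrors the idea of raising
    executeTimeout when large files hit transient network stalls."""
    delay = base_delay
    for attempt in range(retries):
        try:
            return upload_fn(timeout)
        except TimeoutError:
            if attempt == retries - 1:
                raise  # exhausted retries: surface the timeout
            time.sleep(delay)
            delay *= 2                     # back off between attempts
            timeout *= timeout_multiplier  # allow more time next try
```

With `timeout_multiplier=2.0`, a 600-second deadline grows to 1200 and then 2400 seconds across retries, which is the same effect as bumping the endpoint's timeout setting by hand.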

Regards,

Sachin B


3 Replies
SachinB
Support

(See the accepted solution above.)

deepaksahirwar
Creator II
Author

Hi @SachinB ,

 

Thanks for your swift support.

Does that mean the issue is from the SAP source end system?

 

 

SachinB
Support

Hello @deepaksahirwar ,

 

The issue is appearing at the target endpoint when uploading the flat files.

 

Regards,

Sachin B