akalpreet
Contributor

Loading Large tables from SAP Hana to Snowflake

Hi Team,

I am using SAP HANA as the source endpoint and Snowflake as the target endpoint for one of my Qlik Replicate use cases.

The problem I am facing is that one of my SAP HANA tables has more than 4 billion records, and when I try to add it to the task, it fails with a "Search Result Size Limit Exceeded" error and rolls back the entire task.

I have tried dividing the load into two parts using Fullload Passthru Filter conditions, with the task's target table preparation set to Do Nothing. However, loading the data into Snowflake this way takes a long time, and the filters have to be changed manually on every reload.
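For clarity, the kind of split I mean looks roughly like the following. These are illustrative filter fragments only; the column name and boundary value are hypothetical, not the actual objects in my schema:

```sql
-- Hypothetical Fullload Passthru Filter fragments that split one
-- table into two ranges on a numeric key column. Each fragment is
-- a WHERE-clause condition applied to its own task/segment.
"ID" <= 2000000000   -- first task/segment
"ID" >  2000000000   -- second task/segment
```

The drawback is exactly what I described above: the boundary value has to be adjusted by hand whenever the table is reloaded.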

Is there any configuration change that would allow a single task to replicate the entire table (4 billion records or more) from SAP HANA into the Snowflake target?

1 Reply
lyka
Support
Support

Hello,

The "Search Result Size Limit Exceeded" error appears to be a source-related issue. If you set the logging component SOURCE_UNLOAD to TRACE and then reload, you will see the unload query that Replicate issues against the source.

Please try running that query manually on the source and check whether you get the same error there.
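As a sketch, the unload query captured in the trace log will have roughly the shape below; the schema, table, and column names here are placeholders, so substitute the actual statement from your SOURCE_UNLOAD trace before running it in SAP HANA studio or hdbsql:

```sql
-- Hypothetical shape of a Replicate full-load unload query.
-- Replace with the exact statement from the TRACE log.
SELECT "ID", "COL1", "COL2"
FROM "MYSCHEMA"."BIG_TABLE";
```

If the same error is raised when the query runs directly on HANA, the limit is being enforced on the source side rather than by Replicate.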

When the source table is very large, try using filters so the data is loaded in smaller batches.


Thanks

Lyka