Instead of performing batch loads in Snowflake by first truncating the target table so it holds no records, create a staging table, load the data into it, and then swap the staging table with the main table. This limits the downtime of the loading process and lets users keep accessing the old data until the new data is in place.
Thanks for this request. I can share that we are making a number of Snowflake target improvements in Replicate and will consider this as part of that list.
Once we know more about timing and dates, I will share the details here.
The main concern is that when Qlik performs a full (batch) load, it zeroes out the requested target table first. I have seen other products use a staging table instead, so the main table keeps all of its data while the load runs. Once the full load into the staging table completes, the replication software renames the staging table to the main table's name. Instead of an outage of 30 minutes to several hours while the load completes, the only outage is during the table-name switch, which happens in milliseconds.
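For illustration, here is a minimal sketch of that staging-and-swap pattern using Snowflake's `ALTER TABLE ... SWAP WITH` statement, driven from Python with the snowflake-connector-python package. The table names (ORDERS, ORDERS_STAGING), connection parameters, and the `@my_stage` load source are all hypothetical placeholders; this shows how the pattern could work in general, not how Replicate itself would implement it.

```python
# Sketch of the staging-table swap pattern for a Snowflake full load.
# All names (ORDERS, ORDERS_STAGING, @my_stage, connection details)
# are hypothetical placeholders, not Replicate internals.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical connection details
    user="my_user",
    password="my_password",
    warehouse="my_wh",
    database="my_db",
    schema="public",
)

try:
    cur = conn.cursor()

    # 1. Build an empty staging table with the same structure as the target.
    cur.execute("CREATE OR REPLACE TABLE ORDERS_STAGING LIKE ORDERS")

    # 2. Run the full load into the staging table. Readers continue to
    #    query ORDERS, which still holds the previous data set.
    cur.execute("COPY INTO ORDERS_STAGING FROM @my_stage/orders/")

    # 3. Atomically exchange the two tables. SWAP WITH switches the
    #    names and metadata in a single statement, so the visible
    #    interruption is just this rename, not the whole load.
    cur.execute("ALTER TABLE ORDERS SWAP WITH ORDERS_STAGING")

    # 4. ORDERS_STAGING now holds the old data; drop it once the new
    #    data has been verified.
    cur.execute("DROP TABLE IF EXISTS ORDERS_STAGING")
finally:
    conn.close()
```

Because the swap is a metadata operation rather than a data copy, queries against the main table see either the complete old data set or the complete new one, never a half-loaded table.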
Yes, this would be really useful to limit the impact of long-running full loads - whether when pushing DDL changes that haven't been auto-processed, when recovering from an issue, or where change data capture isn't available.