AguWolkovicz
Partner - Contributor II

Zero-downtime Backup in Qlik Sense

Dear Community,

We are working on a way of achieving zero-downtime backups.

As stated in this Article:

As described in Qlik Sense Help on Backing up and restoring a site, the key requirement for a successful backup is to stop all services. This means that the environment is forced into a short period of downtime while the backup is executed.

It is theoretically possible to take a backup of the repository (PostgreSQL) database without stopping the service. However, it is not possible to ensure that app files and static content are in the same state as the database if the Qlik Sense services are not stopped prior to the backup.

All Qlik Sense services must be stopped to ensure that the database, app files, and static content are in the same state in the backup. Importantly, in a multi-node environment, all services on all nodes must be stopped prior to the backup.

The only way to accomplish this is to stop all Qlik Sense-related services prior to the backup.

To minimize user impact even further, we plan to perform this full-backup process once a week, for example over the weekend. That keeps the service-downtime window as small as possible for users accessing the platform from different time zones; a rough sketch of the full-backup sequence follows below.
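For illustration only, this is a minimal sketch of what the scripted weekly full backup could look like. The Windows service names, the repository database name and port ("QSR" on 4432), the share path, and the backup destination are assumptions based on a default installation and would need to be verified and tested in the actual environment.

```python
# weekly_full_backup.py -- sketch of the weekly full backup (assumes default
# Windows service names, the repository DB "QSR" on port 4432, local pg_dump).
import datetime
import pathlib
import shutil
import subprocess

SERVICES = [                          # assumed default service names; stop all of
    "QlikSenseProxyService",          # them, but leave the Repository Database
    "QlikSenseEngineService",         # service running so pg_dump can connect
    "QlikSenseSchedulerService",
    "QlikSensePrintingService",
    "QlikSenseServiceDispatcher",
    "QlikSenseRepositoryService",
]
SHARE = pathlib.Path(r"\\fileserver\QlikShare")                 # assumed share path
DEST = pathlib.Path(r"D:\Backups") / datetime.date.today().isoformat()


def run(cmd, **kwargs):
    subprocess.run(cmd, check=True, **kwargs)


DEST.mkdir(parents=True, exist_ok=True)
try:
    for svc in SERVICES:                                        # stop Qlik services
        run(["net", "stop", svc])
    # dump the repository database; password via PGPASSWORD or a .pgpass file
    run(["pg_dump", "-h", "localhost", "-p", "4432", "-U", "postgres",
         "-F", "t", "-f", str(DEST / "QSR_backup.tar"), "QSR"])
    # copy apps and static content from the file share
    shutil.copytree(SHARE, DEST / "QlikShare")
finally:
    for svc in reversed(SERVICES):                              # always restart
        run(["net", "start", svc])
```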

Of course, between full backups there is a seven-day gap in which any kind of disaster can occur. The worst case is a failure one second before the next weekly full backup, in which case we would lose almost seven days of everything.

On a daily basis, we plan to back up everything that can't be recovered simply by re-querying the production/staging database or any other data source. If there is a lot of new data, it will just take longer for the QVDs to catch up with the database, and that's all (i.e., we only need time and patience).

The daily backups will consist of the spreadsheets we use for configuration purposes, all of our Qlik applications (with no data), tasks, data connections, users, streams, etc.; anything that has been created by hand and can't simply be rebuilt or reconstructed. We plan to cover the spreadsheets with a script that saves and uploads them to a repository, and the Qlik applications by automating Qlik-CLI, which wraps the API call for exporting applications without data (a sketch of that export is shown below).
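As a rough illustration of the "apps without data" part, the sketch below calls the Qlik Sense Repository Service (QRS) API directly with certificate authentication instead of going through Qlik-CLI. The hostname, certificate paths, and output folder are placeholders, and the export endpoint and the skipdata parameter should be verified against the QRS API documentation for the installed Qlik Sense version, since the export flow has changed between releases.

```python
# export_apps_no_data.py -- sketch: export every app without data via the QRS API.
# Host, cert paths, and output folder are placeholders; verify the export endpoint
# and the skipdata parameter against your version's QRS API documentation.
import pathlib
import uuid

import requests

HOST = "https://central-node.example.com:4242"      # QRS on the central node (assumed)
CERT = (r"C:\certs\client.pem", r"C:\certs\client_key.pem")
XRF = "abcdefghijklmnop"                            # any 16-character value
HEADERS = {
    "X-Qlik-Xrfkey": XRF,
    "X-Qlik-User": "UserDirectory=INTERNAL; UserId=sa_repository",
}
OUT = pathlib.Path(r"D:\Backups\apps")
OUT.mkdir(parents=True, exist_ok=True)


def qrs(method, path, **kwargs):
    params = {"xrfkey": XRF, **kwargs.pop("params", {})}
    # verify=False only because of the self-signed Qlik certs; prefer verify=root.pem
    return requests.request(method, f"{HOST}{path}", params=params,
                            headers=HEADERS, cert=CERT, verify=False, **kwargs)


for app in qrs("GET", "/qrs/app/full").json():
    token = uuid.uuid4()
    # ask the QRS to export the app without its data
    export = qrs("POST", f"/qrs/app/{app['id']}/export/{token}",
                 params={"skipdata": "true"}).json()
    # depending on the version, downloadPath may be served via a different port/proxy
    qvf = qrs("GET", export["downloadPath"])
    safe_name = "".join(c for c in app["name"] if c.isalnum() or c in " -_")
    (OUT / f"{safe_name}_{app['id']}.qvf").write_bytes(qvf.content)
```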

That leaves the tasks, data connections, users, streams, etc. These pieces can't be exported from the QMC or via an API call as easily as Qlik apps. We plan to dump only those repository-database tables that hold the information relevant to each of those components. For example, for each task we want to save which app it triggers and how its triggers are configured (for an event trigger, which task it is chained to; for a scheduled trigger, the time period). A rough sketch of such a dump follows below.
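For the table-level dump, something along the following lines could work. The table names are pure assumptions (the QSR schema is not documented and changes between versions), so they have to be checked against the actual database first; the quoted identifiers are there because the QSR tables use case-sensitive names.

```python
# dump_repository_tables.py -- sketch: dump only selected QSR tables as a daily backup.
# The table names are assumptions; inspect the actual QSR schema (e.g. \dt in psql)
# for your Qlik Sense version before relying on this list.
import datetime
import os
import pathlib
import subprocess

TABLES = ['"ReloadTasks"', '"SchemaEvents"', '"CompositeEvents"',
          '"DataConnections"', '"Users"', '"Streams"']          # assumed table names
DEST = pathlib.Path(r"D:\Backups\daily") / datetime.date.today().isoformat()
DEST.mkdir(parents=True, exist_ok=True)

cmd = ["pg_dump", "-h", "localhost", "-p", "4432", "-U", "postgres",
       "--data-only", "-f", str(DEST / "qsr_partial.sql"), "QSR"]
for table in TABLES:
    cmd += ["-t", table]          # one -t per table; quotes keep the case-sensitive names

# password via PGPASSWORD here only for illustration; prefer a .pgpass file
env = dict(os.environ, PGPASSWORD="change_me")
subprocess.run(cmd, env=env, check=True)
```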

Also, an important note: for the time being, we assume that the restore process for the daily backups will be manual.

Finally,

Does this approach make sense? Are we heading in the right direction?

Did anybody ever face a challenge like having to guarantee zero-downtime backups?

Does anybody have another approach to achieve this?

Does anybody have a summary of which tables we should query in order to cover the information we need on a daily basis?

Any directions, bits of advice, or suggestions will be more than welcome!

Thanks in advance!

Agu.-

1 Reply
rohitk1609
Master

A few points from my side:
1. I don't agree that you need to stop the services on all nodes of a multi-node setup to back up the QSR. Stopping the services on the central (master) node is enough, except for the Repository Database service, which must stay running.

2. I don't believe the QSR backup captures any static content from the share folder. The QSR backup file holds configuration data; for physical data such as apps, static content, extensions, and images you have to use a regular file-level backup process.

3. Did you actually run into any problem when taking a QSR backup without stopping the services? I have done it many times and it works fine for me.

4. Regarding the daily backup: yes, you can export the apps, and I have seen a document describing how to copy only the apps that were refreshed since the last run instead of all of them. On the other hand, I'm not sure you can save tasks, streams, etc. via the CLI, because these are metadata rather than physical files.

5. Regarding exporting only the important information: first, how do you define what is important? If you ask me, everything is important. And even if you do export the important tables, do you have a PostgreSQL database developer who can work with those tables and restore them manually?

Solution: if you can't afford even two minutes of daily downtime, first try taking a backup without stopping the services and then test a restore of it. How much time does a backup process really need? Two minutes at most. Schedule the task with Windows Task Scheduler, and if you think it will take more than two minutes, set the failover timeout to one minute so that your failover candidate takes over. You need to back up the QlikShare folder separately.
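For completeness, here is a minimal sketch of registering such a nightly backup with Windows Task Scheduler (schtasks), as suggested above. The task name, script path, and start time are placeholders.

```python
# schedule_backup.py -- sketch: register a nightly backup job with Windows Task
# Scheduler via schtasks. Task name, script path, and start time are placeholders.
import subprocess

subprocess.run([
    "schtasks", "/Create",
    "/TN", "QlikSenseNightlyBackup",              # hypothetical task name
    "/TR", r"python C:\scripts\daily_backup.py",  # hypothetical backup script
    "/SC", "DAILY",
    "/ST", "02:00",
    "/RU", "SYSTEM",                              # run under the SYSTEM account
], check=True)
```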