Hi everyone,
Starting with Qlik Sense September 2017, we introduced the Qlik Centralized Logging (QLogs) database. This database is managed by the Qlik Logging Service.
Databases still need space, though. Deployments that have been using the QLogs database for an extended period, or larger and more active deployments, may find themselves accumulating a significant amount of data. At this stage, your QLogs database might be consuming several GB of disk space.
This situation can become a real headache for your System Administrator. But don’t worry: there are ways around this (housekeeping), and I’ll go over how to prevent it from happening in the first place, as well as how to slim down an existing database.
Here is what we will cover today:
Introduction
The Qlik Logging Service centralizes logging by collecting all the log messages and inserting them into a PostgreSQL database named QLogs.
In addition to collecting the Qlik Sense services logs, it also consumes the Windows Event logs and Performance Counters such as CPU, memory, disk latency and queue, network, etc.
Right now, it is most commonly used as the default data source for the Monitoring Applications reload. If Centralized Logging is not enabled, the monitoring applications fall back to the log files as their data source. More details on this are available in my previous post, All you need to know about deploying the Qlik Sense Monitoring Applications.
Am I using the DB or flat files?
To verify whether Qlik Centralized Logging is enabled in your environment, you can run the following command from the central node.
Microsoft Windows [Version 10.0.17134.619]
(c) 2018 Microsoft Corporation. All rights reserved.

C:\Windows\system32> cd C:\Program Files\Qlik\Sense\Logging

C:\Program Files\Qlik\Sense\Logging> Qlik.Logging.Service.exe validate
archive_age_in_hours: 720
last_forced_db_trim: 2019-04-02 16:32:52.107208+02
purge_age_in_hours: 1440
version: 0.5
CentralizedLoggingEnabled: True
LoggingToFileEnabled: True
MaximumDatabaseSizeInGB: Max db size not enforced
DB Host: QlikServer1.domain.local
DB Name: QLogs
DB Writer User: qlogs_writer
DB Reader User: qlogs_reader
DB Port: 4432
Reference and additional information are available in the knowledge article: How To Verify Qlik Sense Is Running With Centralized Logging Service
If you wish to set up Qlik Centralized Logging, you can refer to help.qlik.com - Qlik Logging Service.
How much space does it take on disk?
To check how much disk space each database consumes, you can use the following SQL query.
I usually use pgAdmin 4 to run it: simply right-click one of the databases and select Query Tool.
SELECT d.datname AS Name,
       pg_catalog.pg_get_userbyid(d.datdba) AS Owner,
       CASE WHEN pg_catalog.has_database_privilege(d.datname, 'CONNECT')
            THEN pg_catalog.pg_size_pretty(pg_catalog.pg_database_size(d.datname))
            ELSE 'No Access'
       END AS SIZE
FROM pg_catalog.pg_database d
ORDER BY
       CASE WHEN pg_catalog.has_database_privilege(d.datname, 'CONNECT')
            THEN pg_catalog.pg_database_size(d.datname)
            ELSE NULL
       END DESC -- nulls first
LIMIT 20
Reference: https://wiki.postgresql.org/wiki/Disk_Usage
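If you prefer the command line to pgAdmin, a quick size check can also be done with psql. This is a minimal sketch, assuming the psql client is available on the server and reusing the connection details from the validate output above (host, port and reader user are from my example environment, so adjust them to yours):

C:\Windows\system32> psql -h QlikServer1.domain.local -p 4432 -U qlogs_reader -d QLogs -c "SELECT pg_size_pretty(pg_database_size('QLogs'));"

You will be prompted for the qlogs_reader password.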
To give you an example: below is the output from a single-node environment with no usage, the DEBUG log level enabled on all services, and data collected over about a week.
name         | owner               | size
---------------------------------------------------
Qlogs        | qlogs_writer        | 548 MB
QSR          | postgres            | 40 MB
SenseService | qliksenserepository | 7223 kB
QSMQ         | qliksenserepository | 7183 kB
postgres     | postgres            | 7055 kB
template0    | postgres            | 6945 kB
template1    | postgres            | 6945 bytes
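Within the QLogs database itself, most of the space typically sits in the public.log_entries and public.archive_entries tables (the same tables used in the query further down). To confirm where the space goes on your system, here is a small sketch using standard PostgreSQL catalog views; run it in the Query Tool while connected to the QLogs database:

SELECT relname AS table_name,
       pg_size_pretty(pg_total_relation_size(relid)) AS total_size -- includes indexes and TOAST data
FROM pg_catalog.pg_statio_user_tables
ORDER BY pg_total_relation_size(relid) DESC;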
It is also possible to narrow down which node(s) or even service(s) are logging the highest number of entries using the following query.
SELECT *
FROM (
    SELECT d.process_host, d.process_name, d.entry_level, COUNT(id) AS message_volume
    FROM (
        SELECT id, process_host, process_name, entry_level FROM public.log_entries
        UNION ALL
        SELECT id, process_host, process_name, entry_level FROM public.archive_entries
    ) AS d
    GROUP BY d.process_host, d.process_name, d.entry_level
) AS d
ORDER BY d.message_volume DESC;
Output:
process_host | process_name | entry_level | message_volume
------------------------------------------------------------
QlikServer1  | proxy        | DEBUG       | 380020
QlikServer1  | proxy        | INFO        | 221839
QlikServer2  | proxy        | INFO        | 103589
QlikServer1  | repository   | INFO        | 98254
QlikServer2  | scheduler    | INFO        | 54269
QlikServer1  | repository   | ERROR       | 35874
QlikServer1  | repository   | WARN        | 34568
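In this example, DEBUG entries from the proxy service clearly dominate. If you only want to see where DEBUG entries come from, for instance to decide which services should be switched back to INFO (see the next section), a variation of the same query can be used. This is just a sketch built from the tables and columns already used above:

SELECT d.process_host,
       d.process_name,
       COUNT(d.id) AS debug_volume
FROM (
    SELECT id, process_host, process_name, entry_level FROM public.log_entries
    UNION ALL
    SELECT id, process_host, process_name, entry_level FROM public.archive_entries
) AS d
WHERE d.entry_level = 'DEBUG'   -- keep only DEBUG entries
GROUP BY d.process_host, d.process_name
ORDER BY debug_volume DESC;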
How can I better control the usage on disk?
To make certain the QLogs database does not start consuming large amounts of disk space, you first need to control and limit the data you want to be logged.
To avoid unnecessary traffic and limit the amount of data logged in the database, we recommend not setting the logging level of any service to DEBUG for an extended period of time.
The DEBUG level is mostly used while troubleshooting an issue and should be set back to INFO right after that.
It is also possible to purge the logs after a certain number of days or to limit the size of the database.
Purge: (where x and y are thresholds in days; the archive_age value must be lower than purge_age, otherwise the command will fail)
Microsoft Windows [Version 10.0.17134.619]
(c) 2018 Microsoft Corporation. All rights reserved.

C:\Windows\system32> cd C:\Program Files\Qlik\Sense\Logging

C:\Program Files\Qlik\Sense\Logging> Qlik.Logging.Service.exe update --archive_age x --purge_age y
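As an illustration only: the validate output earlier showed archive_age_in_hours: 720 and purge_age_in_hours: 1440, which corresponds to 30 and 60 days. Assuming, as described above, that x and y are given in days, the equivalent command would be:

C:\Program Files\Qlik\Sense\Logging> Qlik.Logging.Service.exe update --archive_age 30 --purge_age 60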
Reference Knowledge article: Qlik Sense - How to control the size of the Qlik Sense Logging Database when using Centralized Loggi...
Limit: (where n is the size in GB)
Microsoft Windows [Version 10.0.17134.619]
(c) 2018 Microsoft Corporation. All rights reserved.

C:\Windows\system32> cd C:\Program Files\Qlik\Sense\Logging

C:\Program Files\Qlik\Sense\Logging> Qlik.Logging.Service.exe update --maximum_db_size_in_gb n
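For example, to cap the database at 5 GB (an illustrative value; pick a limit that fits your disk and your retention needs):

C:\Program Files\Qlik\Sense\Logging> Qlik.Logging.Service.exe update --maximum_db_size_in_gb 5

You can re-run Qlik.Logging.Service.exe validate afterwards to confirm the new settings.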
Reference Knowledge article: Qlik Sense - How to control the size of the Qlik Sense Logging Database when using Centralized Loggi...
Help! What to do if it consumes all or almost all the available disk space?
First of all, you can take the steps mentioned above to control and limit the amount of information logged into the database.
But purging or reducing the size of the database can take a long time, so if you need a quick (but short-term) solution, you can temporarily increase the size of the disk while the purge is running.
If this is not possible, you can trigger a manual archive followed by a purge of the logs in the QLogs database. (Where X is in hours)
Microsoft Windows [Version 10.0.17134.619]
(c) 2018 Microsoft Corporation. All rights reserved.

C:\Windows\system32> cd C:\Program Files\Qlik\Sense\Logging

C:\Program Files\Qlik\Sense\Logging> Qlik.Logging.Service.exe archive --cutoff_in_hours X

C:\Program Files\Qlik\Sense\Logging> Qlik.Logging.Service.exe purge --cutoff_in_hours X
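As a worked example with illustrative values: to keep roughly one week of data, X would be 168 (7 days x 24 hours), so the commands become:

C:\Program Files\Qlik\Sense\Logging> Qlik.Logging.Service.exe archive --cutoff_in_hours 168

C:\Program Files\Qlik\Sense\Logging> Qlik.Logging.Service.exe purge --cutoff_in_hours 168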
Reference: Qlik logging service database urgently needs to be reduced in size, available via help.qlik.com.
If presented with a message stating the system is too busy, retry the command.
If the second try generates the same message, try again at a less busy time on the server.
Thank you for reading my blog post and I really hope it helps you.
I’ll be watching for your questions, feedback or suggestions so please comment and share!