duzunic88
Contributor II

Changing Central Node to Slave results in Monitoring Apps not showing Node logs

Hi,

 

Recently I made a change to our environment, making the central node's scheduler the Slave. This was done to guarantee that the central node always handles all reloads, while the other node, which is consumer facing and also the failover candidate (the failover candidate must have the scheduler enabled), was made the Master. While this works great, I noticed that all my monitoring apps now only show log information for the central node and not the consumer node. On top of that, the monitoring apps have no prior historical data for the central node; their data starts on the date when I made the Slave/Master change.

 

I'm curious why the Slave/Master change impacted the monitoring apps. I'm not sure why making the central node handle all reloads would affect what data it retrieves from the ArchivedLogs folder, which does indeed contain both nodes' data. Like I said, this was all working fine until the Slave/Master change. Any help would be appreciated.

19 Replies
Tyler_Waterfall
Employee

Did you configure the apps for multi-node as described in the documentation?

Configuring Monitor Apps

Mainly, make sure that 

  • The ServerLogFolder data connection uses the FQDN of the central node
  • The ArchivedLogsFolder data connection uses the FQDN path to the ArchivedLogs folder in the cluster share
  • Is centralized logging turned on now? Check whether the apps are fetching data from the Log Database or the Log Files by looking at the Log Details sheet of the apps.
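
As a quick sanity check of the ArchivedLogsFolder connection, something like the following in a test app should list one folder per node. This is a rough sketch using the same pattern as the monitoring apps' own script, not the shipped script itself:

// List the node folders visible through the ArchivedLogsFolder connection.
// Each node in the cluster should show up exactly once.
FOR each vFolder in DirList('lib://ArchivedLogsFolder/*')
    node_list:
    LOAD
        '$(vFolder)' & '\'              as folder,
        SubField('$(vFolder)', '/', -1) as [Node Name],
        FileTime('$(vFolder)')          as folder_Time
    AutoGenerate 1;
NEXT vFolder

TRACE Nodes found: $(=Concat([Node Name], ', '));

If only the central node shows up there, the issue is likely with the share or with log archiving rather than with the apps themselves.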
duzunic88
Contributor II
Contributor II
Author

@Tyler_Waterfall Thanks for the info!

ArchivedLogsFolder is using FQDN and works.

I'm having an issue with ServerLogFolder, though. Currently it is set to C:\ProgramData\Qlik\Sense\Log, and when I try to change it to \\hostname\ProgramData\Qlik\Sense\Log I get the following:

Cannot open file: 'lib://ServerLogFolder/governanceLicenseLog_7.10.0_db.qvd' (Native Path: *** System error: ***)

When I try to browse the path from the load script I also get an "Invalid Path" error. Any thoughts?

duzunic88
Contributor II
Author

Ah, looks like I had to share the folder. Got that to work and followed the rest of the steps, but I still only see my central node's log data.

Like I mentioned, I was seeing all nodes until the Master/Slave switch. Could that have caused something?

 

Bastien_Laugiero

Hello,

I did some testing with Qlik Sense June 2018 with 2 nodes (a central node called QlikServer1.domain.local and 1 rim node called QlikServer2.domain.local).

I set the central node as Slave and the rim node as Master, but I couldn't reproduce your issue, meaning I continued to get data loaded from all nodes.

Could you check the following in the script logs:

  • The script builds its list of nodes for the archived logs at line 0248:
2019-01-09 12:02:26 0248 FOR each folder in DirList(archivedLogsFolder & '*')
2019-01-09 12:02:26 0249 node_list:
2019-01-09 12:02:26 0250 Load
2019-01-09 12:02:26 0251 'lib://ArchivedLogsFolder/qlikserver1.domain.local'&'\' as folder,
2019-01-09 12:02:26 0252 mid('lib://ArchivedLogsFolder/qlikserver1.domain.local',26) as [Node Name],
2019-01-09 12:02:26 0253 FileTime( 'lib://ArchivedLogsFolder/qlikserver1.domain.local' ) as folder_Time
2019-01-09 12:02:26 0254 AutoGenerate 1
2019-01-09 12:02:26 3 fields found: folder, Node Name, folder_Time, 
2019-01-09 12:02:26 1 lines fetched
2019-01-09 12:02:26 0256 NEXT folder
2019-01-09 12:02:26 0249 node_list:
2019-01-09 12:02:26 0250 Load
2019-01-09 12:02:26 0251 'lib://ArchivedLogsFolder/qlikserver2.domain.local'&'\' as folder,
2019-01-09 12:02:26 0252 mid('lib://ArchivedLogsFolder/qlikserver2.domain.local',26) as [Node Name],
2019-01-09 12:02:26 0253 FileTime( 'lib://ArchivedLogsFolder/qlikserver2.domain.local' ) as folder_Time
2019-01-09 12:02:26 0254 AutoGenerate 1
2019-01-09 12:02:26 3 fields found: folder, Node Name, folder_Time, 
2019-01-09 12:02:26 2 lines fetched

Could you check in your script logs whether it fetches both your central and rim nodes?

  • If it does, further down in the logs you should see it starting to load files from the archived logs folder for your rim node:
2019-01-09 12:02:26 0333 for each textFile in FileList(logName & '*.' & extension)
          2019-01-09 12:02:26 0336 If FileTime( 'lib://ArchivedLogsFolder/qlikserver2.domain.local/Repository/Audit/QLIKSERVER2_AuditSecurity_Repository_2019-01-09T12.01.00Z.log' ) >= 43473.501689815 then
            2019-01-09 12:02:26 0338 CONCATENATE (working)  
            2019-01-09 12:02:26 0339       LOAD
            2019-01-09 12:02:26 0340         Timestamp("Timestamp") AS LogEntryPeriodStart,
            2019-01-09 12:02:26 0341         Timestamp("Timestamp") as LogTimeStamp,
            2019-01-09 12:02:26 0342         lower(Hostname) as Hostname,
            2019-01-09 12:02:26 0343 		Description,		
            2019-01-09 12:02:26 0344         ProxySessionId,
            2019-01-09 12:02:26 0345         ProxyPackageId,
            2019-01-09 12:02:26 0346         RequestSequenceId,
            2019-01-09 12:02:26 0347         ProxySessionId&ProxyPackageId as _proxySessionPackage,
            2019-01-09 12:02:26 0348         Message,
            2019-01-09 12:02:26 0349 		Service,
            2019-01-09 12:02:26 0350         Context,
            2019-01-09 12:02:26 0351         Command,
            2019-01-09 12:02:26 0352         Result,
            2019-01-09 12:02:26 0353         ProductVersion,
            2019-01-09 12:02:26 0354         ObjectId,
            2019-01-09 12:02:26 0355         ObjectName,
            2019-01-09 12:02:26 0356         UserDirectory,
            2019-01-09 12:02:26 0357         UserDirectory & '\' & UserId as UserId,	
            2019-01-09 12:02:26 0358         
            2019-01-09 12:02:26 0359         SecurityClass,
            2019-01-09 12:02:26 0360         If(Result=0 or (Result >=200 and Result <=226),dual('OK',0), if(Result=' ',dual('Blank',0), dual('NOK',1))) as Status,        
            2019-01-09 12:02:26 0361         IF(WildMatch(Message,'* access granted*')>0,SubField(Message,' ',1)&' Access') as [Access Type], 
            2019-01-09 12:02:26 0362         IF(WildMatch(Message,'* access granted*')>0,Date(Floor("Timestamp"))) as [Access Date],
            2019-01-09 12:02:26 0363         IF(Result=403,1) as UsageDenied,					
            2019-01-09 12:02:26 0364         IF(left(Message,20)='Login access granted',purgechar(TextBetween(Message,'UsageID: ',','),chr(39))) as UsageId,	
            2019-01-09 12:02:26 0365         IF(left(Message,20)='Login access granted',purgechar(TextBetween(Message,'Accessname: ',','),chr(34)&chr(39)),
            2019-01-09 12:02:26 0366             IF(left(Message,24)='Login access for license',purgechar(TextBetween(Message,' Name: ',','),chr(34)&chr(39)))) as [Login Access Rule],
            2019-01-09 12:02:26 0367         IF(((Context='/qrs/licenseadd' and not Origin = 'ManagementAccess') OR Context='/qrs/licenseupdate' or Context='/qrs/license/datamarket')
            2019-01-09 12:02:26 0368             OR WildMatch(Command,'Add * access','Update * access','Delete license')>0  
            2019-01-09 12:02:26 0369             OR (index(lower(ObjectName),'license')>=1  AND WildMatch(Command,'Add rule','Update rule','Delete rule')>0),
            2019-01-09 12:02:26 0370             1) as [License Allocation],													
            2019-01-09 12:02:26 0371 
            2019-01-09 12:02:26 0372         IF(Context='/qrs/license/datamarket','DataMarket License',
            2019-01-09 12:02:26 0373             IF(Context='/qrs/licenseadd' and Origin = 'ManagementAccess',null(),
            2019-01-09 12:02:26 0374                 IF(Context='/qrs/licenseadd' OR Context='/qrs/licenseupdate','Site License',
            2019-01-09 12:02:26 0375                     IF(left(Message,24)='Login access for license',purgechar(TextBetween(Message,' Name: ',','),chr(34)&chr(39)),
            2019-01-09 12:02:26 0376                         ObjectName)))) as [Affected Entity],
            2019-01-09 12:02:26 0377         
            2019-01-09 12:02:26 0378         Id as Id_temp		
            2019-01-09 12:02:26 0379                         
            2019-01-09 12:02:26 0380       FROM 'lib://ArchivedLogsFolder/qlikserver2.domain.local/Repository/Audit/QLIKSERVER2_AuditSecurity_Repository_2019-01-09T12.01.00Z.log'
            2019-01-09 12:02:26 0381       (txt, utf8, embedded labels, delimiter is '\t', msq)
            2019-01-09 12:02:26 0382           WHERE isnum(Sequence#)
            2019-01-09 12:02:26 0383            AND (exists(Command)	
            2019-01-09 12:02:26 0384                 OR (index(lower(ObjectName),'license')>=1  AND (Command = 'Add rule' or Command = 'Update rule' or Command = 'Delete rule'))
            2019-01-09 12:02:26 0385                 OR SecurityClass = 'License')
            2019-01-09 12:02:26      	28 fields found: LogEntryPeriodStart, LogTimeStamp, Hostname, Description, ProxySessionId, ProxyPackageId, RequestSequenceId, _proxySessionPackage, Message, Service, Context, Command, Result, ProductVersion, ObjectId, ObjectName, UserDirectory, UserId, SecurityClass, Status, Access Type, Access Date, UsageDenied, UsageId, Login Access Rule, License Allocation, Affected Entity, Id_temp, 
            2019-01-09 12:02:26      659 lines fetched

Could you check if you see similar behavior and lines being fetched?

  • Finally, have you tried switching the configuration back (Central = Master and Rim = Slave)? Does it work then?

Thank you!!

Bastien Laugiero
If a post helps to resolve your issue, please mark the appropriate replies as CORRECT.
duzunic88
Contributor II
Author

@Bastien_Laugiero Thank you for your response!! Hmm, it looks like my script log file is not as detailed as yours (Qlik June 2018). Attached is my License Monitor script log with sensitive information removed. Do I need to adjust my scheduler logging levels to see more detail like yours?

Another thing I noticed, though I'm not sure it's relevant: I only see information pulled in from governanceLicenseLog_7.10.0_db.qvd, while an older version of the License Monitor app that was working was pulling data from governanceLicenseLog_7.10.0_file.qvd. Do you think this has anything to do with it?

My last resort is to revert everything, which I will most likely try soon to see if that fixes the issue.

duzunic88
Contributor II
Author

@Bastien_Laugiero Just reverted the settings and restarted the services, and that did not fix the issue... this must be caused by something else =/

Bastien_Laugiero

Hello, 

Thank you for the logs. 

So the reason you don't see the same entries is that, in your environment, the monitoring applications are loading their data from the Qlik Logging Database and not from the log files.

Basically, by default, the Monitoring apps will try to connect to the QLogs database. If they can, they will load the data from there; if they can't, they will use the log files instead.

I suspect that your rim node is not logging anything in that database, which is why you don't see its data in the monitoring apps.

I would say we have two alternatives: 

  • You can force the monitoring applications to load their data from the log files. To do that: 
    • Edit the Operations Monitor and License Monitor scripts. (You will need to duplicate those apps in order to have them in your workspace.)
    • On line 10 of the script you will see the following variable: SET db_v_file_override = 0;
    • Change the value to 1 (see the sketch after this list).
    • Save the change, then publish the duplicate to replace the original application.
    • Try to reload again.
  • You can try to fix the rim node not writing data to the QLogs database.
    • Check the following article in that case.
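
For reference, the change itself is tiny. Below is a simplified sketch of what that part of the duplicated script looks like afterwards; the branching at the end is illustrative only, not the shipped script:

// Around line 10 of the duplicated Operations / License Monitor script.
// 0 = try the QLogs (centralized logging) database first, fall back to log files
// 1 = skip the database and always read the log files
SET db_v_file_override = 1;

// Illustration of how the flag is used further down; the real scripts
// branch in more places than this.
IF '$(db_v_file_override)' = '1' THEN
    TRACE Data source: Log File (ServerLogFolder / ArchivedLogsFolder);
ELSE
    TRACE Data source: Log Database (QLogs), log files as fallback;
END IF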

Hope this helps!

Bastien Laugiero
If a post helps to resolve your issue, please mark the appropriate replies as CORRECT.
duzunic88
Contributor II
Author

@Bastien_Laugiero Thanks again for the quick response! I think this puzzle is getting close to being solved.

I followed the instructions from the article, and indeed the following shows up: 'CentralizedLoggingEnabled: True' and 'LoggingToFileEnabled: True', so everything looks good there.

After setting SET db_v_file_override = 1, the apps do indeed pull data from the log files; however, governanceLogMonitor_7.10.0_file has not been updated since 12/26/2018, which is exactly when I started seeing this problem.

Do you by any chance know what process/task updates the governanceLogMonitor_7.10.0_file QVD? I think that would be the next place to investigate...

Bastien_Laugiero

Hello, 

So basically, when you reload any of the Monitoring Applications, the load script will load the data either from the QLogs database or from the log files, as explained earlier, and will then store it into a QVD file.

That way, the next time you reload those applications, they just do an incremental load instead of reloading all the data again.
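
As a rough sketch of that pattern (paths, hostnames and field names below are simplified from the logs above; the shipped scripts loop per node and per file with FileTime checks rather than using one wildcard load):

// Simplified sketch of the store-then-incremental pattern, not the shipped script.
// 1) Reload what was stored last time from the QVD.
LogHistory:
LOAD * FROM [lib://ServerLogFolder/governanceLogMonitor_7.10.0_file.qvd] (qvd);

// 2) Remember the newest timestamp already stored (assumes rows are in time order).
LET vLastLoaded = Num(Peek('LogTimeStamp', -1, 'LogHistory'));

// 3) Append only newer entries from a node's archived log files (hostname is a placeholder).
Concatenate (LogHistory)
LOAD
    Timestamp("Timestamp") as LogTimeStamp,
    lower(Hostname)        as Hostname,
    Message
FROM [lib://ArchivedLogsFolder/qlikserver2.domain.local/Repository/System/*.log]
(txt, utf8, embedded labels, delimiter is '\t', msq)
WHERE Num(Timestamp("Timestamp")) > $(vLastLoaded);

// 4) Write the combined history back so the next reload is incremental again.
STORE LogHistory INTO [lib://ServerLogFolder/governanceLogMonitor_7.10.0_file.qvd] (qvd);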

The file you mentioned, governanceLogMonitor_7.10.0_file.qvd, is generated by the monitoring app called Log Monitor.

When you say this file has not been updated since 12/26, is it stored in C:\ProgramData\Qlik\Sense\Log on the rim node or on the central node?

If it's on the rim node, then it's expected that it's not being updated since it's the central node doing the reload now.

If you reload the application called Log Monitor, you should see this file being generated/updated on the central node.

Quick question: when you use SET db_v_file_override = 1, do you see data for both nodes, or still only for the central node?

Thanks! 

Bastien Laugiero
If a post helps to resolve your issue, please mark the appropriate replies as CORRECT.