warfollowmy_ver
Creator III

QV 12.5.2 slow down logging. Log every <n> seconds.

For your information: after switching to QV 12.5.2, many documents started failing several times a day. There are no problems in the desktop, and we did not observe this in previous versions. The logs say "slow down logging. Log every <n> seconds", and at the 1-hour timeout the document fails.

9 Replies
Miguel_Angel_Baeyens

First, what is the actual error of the task? What is the error in the script (you can see it in the document log)? That's what will help you find out why the task times out.

Second, this is nothing new. QlikView has been doing this for years: as soon as a task takes a certain amount of time to reload, regardless of the result (success or failure), the QMC shows the logging slowing down. In itself it is neither good nor bad.

 

warfollowmy_ver
Creator III
Author

@Miguel_Angel_Baeyens 

The task is killed by the service at the 1-hour timeout. Document-level logging is enabled, but there are no document logs.
This happens on various documents that are not related in any way. For several years this definitely did not happen, certainly not at this scale: every day, 2-5 times, several documents.

marcus_sommer

I'm not sure your issue is really related to this new QV release. Usually there should be a document log in which the last unfinished statement is obvious - are you sure there really is none? Within the user properties, on the General tab, there is a setting to save the log file immediately, which AFAIK has the effect that each load step is written to the file directly rather than only when the load finishes - which doesn't happen if the script fails (I'm not sure whether the server's Settings.ini also needs to be adjusted for this).

If this isn't possible, a workaround might be to add some manual logging - meaning, after each load statement, store a txt with the name of the last table + time + row count or similar (see the sketch below) - which makes it possible to see where the execution stopped. Once you know this point it will be easier to find the reason behind it - which is probably that Qlik is just waiting for the database or the storage/network to return a value. If there is a problem but no error message is returned, Qlik will wait forever unless a layer above - the QMC, or maybe you as a user - breaks it with a timeout or by killing the task. And there are a lot of possibilities for what may have happened - the database/network/storage being (temporarily) locked or broken, or similar …
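
A minimal sketch of what such manual logging could look like in QlikView script - the log folder, subroutine name and table names are placeholders of mine, not anything from this thread:

// Hypothetical helper: after each load, write a small txt with the
// table name, a timestamp and the row count of that table.
SUB WriteProgress(vTable)
    ProgressLog:
    LOAD
        '$(vTable)'            AS Step,
        Timestamp(Now())       AS FinishedAt,
        NoOfRows('$(vTable)')  AS Rows
    AutoGenerate 1;

    STORE ProgressLog INTO [C:\QlikLogs\Progress_$(vTable).txt] (txt);
    DROP TABLE ProgressLog;
END SUB

Orders:
LOAD * FROM [Orders.qvd] (qvd);     // placeholder load
CALL WriteProgress('Orders');

Customers:
LOAD * FROM [Customers.qvd] (qvd);  // placeholder load
CALL WriteProgress('Customers');

The last Progress_*.txt that was written (or the first one missing) then shows the statement where the execution stopped.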

- Marcus  

Miguel_Angel_Baeyens

I agree with Marcus.

Your settings definitely look correct, but also open the QVW with QlikView Desktop and check Settings > Document Properties > General > Generate Logfile (if that was your screenshot 3, no need to do it again).

The fact that the logs themselves are not being created looks suspicious. Apart from the QlikView product upgrade, have there been any changes to storage, network, accounts, access/folder rights, etc.? Does the QMC have access to all the folders specified there (in particular, the log folders)?

To check that, see what happens if you open the QVW with QlikView Desktop using the account that runs the QlikView services and manually reload the app from the script editor. Does it fail? If it does, where does it stop?

It might be worth logging a case with Qlik Support (https://support.qlik.com/articles/000043153) and providing as many logs and as much environment information as possible.

warfollowmy_ver
Creator III
Author

I ticked this checkbox and still no log file appears. If the process is killed, there is no log at all.
In every case the log appears only after the script has terminated, whether successfully or with a script error. When what I described above happens, the script log is simply not written.


I also checked the next point. I turned on the script log, launched the task through the console and stopped it after half a minute; it went into aborting status and then into failed status, but there was no script log file in the folder with the task's document. If the task succeeds, or if I artificially cause an error in the script, then there is a script log file.
I can clearly see that when the task is cancelled in the console - manually or by timeout - or the process is killed in the desktop, the script log is simply not created.

I checked this on two servers running version 12.50.20000.0.

This is easy to verify - just repeat these steps:
1) Make a document with, for example, this script: LOAD 1 as [#] AutoGenerate(1000000000); (see the sketch after this list).

2) Turn on script logging.

3) Make a task from this document.

4.1) Next, run the task from the console and stop it manually after some time. After the aborting and failed statuses there will be no script log.

4.2) Then change the timeout to one minute and run the task from the console again so that it is stopped by the timeout. After the failed status there will be no script log.
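
For reference, the whole step 1 test script is just this single long-running load (the table label is my own; the script log itself is switched on via Document Properties > General > Generate Logfile, as in step 2):

// Deliberately long-running load: roughly a billion generated rows give
// plenty of time to abort the task or let it run into the timeout mid-reload.
TestData:
LOAD 1 as [#]
AutoGenerate(1000000000);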

warfollowmy_ver
Creator III
Author

I can't understand it. I spent an hour and a half this morning checking everything and wrote a post here - it was definitely there this morning, and now it's gone! I won't write it all again, but the bottom line is that terminating the task manually or by timeout does not produce the script log under any settings. This is easy to verify; I checked it on two servers with QV 12.50.20000.0.

marcus_sommer

Here is your earlier comment - I have no idea why it vanished ...

[Attached screenshot of the earlier comment: Community1.JPG]

marcus_sommer

Again, I'm not absolutely sure how the publisher really performs the tasks, but AFAIK the document log is created only once and stored, unchanged, in the folder in which the called application is located. After finishing the task, the publisher copies the document log from there into the task folder. And yes, this means the task itself must succeed to the end for that copy to happen, but the original log file should always be there (if the above option of writing each statement immediately is enabled).

- Marcus

Brett_Bleess
Former Employee

Check the following path for the in-process/failed script log:

C:\ProgramData\QlikTech\QlikViewBatch

I believe that is the location where it would be if the task is not succeeding. I thought we had changed things so that it also writes to the normal location either way, but that may not be the case. The other easy thing to do is to increase your task timeout from 1 hour to 2 hours and see whether that helps; it could be that the limit is simply no longer long enough to cover the load time, so that is another quick thing to try.

Regards,
Brett

To help users find verified answers, please do not forget to use the "Accept as Solution" button on any post(s) that helped you resolve your problem or question.
I now work a compressed schedule, Tuesday, Wednesday and Thursday, so those will be the days I will reply to any follow-up posts.