Not applicable

How to resolve or narrow down the below error?

Hello,

I am getting the below errors logged in the task log:

The task "Knova KM Analytics 811" failed. Exception:
QDSMain.Exceptions.TaskFailedException: Task execution failed with errors to follow. ---> QDSMain.Exceptions.ReloadFailedException: Reload failed ---> System.Threading.ThreadAbortException: Thread was being aborted.
   at System.Threading.WaitHandle.WaitOneNative(SafeHandle waitableSafeHandle, UInt32 millisecondsTimeout, Boolean hasThreadAffinity, Boolean exitContext)
   at System.Threading.WaitHandle.InternalWaitOne(SafeHandle waitableSafeHandle, Int64 millisecondsTimeout, Boolean hasThreadAffinity, Boolean exitContext)
   at SolutionGlobal.ThreadPool.ThreadPoolJob.SafeWaitEvent(WaitHandle eventWaitHandle)
   at QDSMain.ReloadTask.Reload(String fileName, TaskResult taskResult, String sectionAccessUserName, String sectionAccessPassword, eReloadOptions reloadOption, String variableName, String variableValue, Boolean moniterCpuUsage)
   --- End of inner exception stack trace ---
   at QDSMain.ReloadTask.Reload(String fileName, TaskResult taskResult, String sectionAccessUserName, String sectionAccessPassword, eReloadOptions reloadOption, String variableName, String variableValue, Boolean moniterCpuUsage)
   at QDSMain.DistributeTask.Execute(TaskResult currentTaskResult)
   --- End of inner exception stack trace ---
   at QDSMain.DistributeTask.Execute(TaskResult currentTaskResult)
   at QDSMain.Task.AbstractTask.TaskExecution(ILogBucket logBucket, TaskResult taskResult)

The screenshots from the task log and the document log are attached.

Can anyone help me understand the issue and resolve this error?

11 Replies
Gysbert_Wassenaar
Partner - Champion III

Check the document log for errors. The document log will be in the same directory and have the same name as the .qvw document with .log appended.
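A minimal sketch of that check, outside QlikView: scan the document log for lines mentioning errors. The path shown is hypothetical; substitute the actual location of your .qvw.

```python
from pathlib import Path

def find_log_errors(log_path, keywords=("Error", "General Script Error")):
    """Return lines from a QlikView document log that mention any keyword."""
    hits = []
    for line in Path(log_path).read_text(errors="replace").splitlines():
        if any(k.lower() in line.lower() for k in keywords):
            hits.append(line)
    return hits

# Example usage (path is an assumption):
# for line in find_log_errors(r"D:\QlikDocs\Knova KM Analytics 811.qvw.log"):
#     print(line)
```

If this returns nothing, the reload likely died without the script itself failing, which points at the engine process rather than the load script.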


talk is cheap, supply exceeds demand
Not applicable
Author

Thanks for the reply. It doesn't show any errors. The last line in the screenshot is the end of the document log.

1/26/2017 17:44:36.7618418  Information  1/26/2017 5:43 pm:       Joining/Keeping
1/26/2017 17:45:36.2610046  Information  1/26/2017 5:44 pm: 1965  drop field rsdSubscribedFlag
1/26/2017 17:46:36.1969730  Information  1/26/2017 5:45 pm: 1966  drop field rsfSubscribedFlag

This is the last record present there.

Thanks,

Karan

Gysbert_Wassenaar
Partner - Champion III

Check that your system did not run out of memory.


talk is cheap, supply exceeds demand
rwunderlich
Partner Ambassador/MVP

It could be that it timed out. Check the Timeout value in the trigger definition.

-Rob

Not applicable
Author

Hi Rob,

The reload was successful a day ago, although it took around 19 hours. In this instance it failed after running for around 16 hours. Since we haven't changed the timeout settings, I am ruling out the possibility of a timeout.

Not applicable
Author

We have over 700 GB of memory on that server, the RAM is also more than 96 GB, and we have enough free space.

Thanks,

Karan

Gysbert_Wassenaar
Partner - Champion III

Is 96 GB the amount of RAM in your machine, or is it 700 GB, and is the document size 96 GB? Reloading a document may require more RAM than the size of the resulting document, because temporary tables may be calculated that are discarded later in the script. Also, the .qvw document could be stored in a compressed form; in RAM it will be uncompressed.
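The compression point can be illustrated with a small sketch (using zlib purely as a stand-in; this is not QlikView's actual storage format): repetitive tabular data shrinks dramatically on disk but occupies its full size once loaded into memory.

```python
import zlib

# Repetitive rows, as tabular data often is (~2 MB in memory)
table_like = b"rsdSubscribedFlag,Y\n" * 100_000

on_disk = zlib.compress(table_like)  # compressed representation

print(f"in memory: {len(table_like):>9,} bytes")
print(f"on disk:   {len(on_disk):>9,} bytes")
```

By the same logic, a modest-looking .qvw on disk can expand to many times its file size in RAM during a reload, before temporary tables are even counted.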


talk is cheap, supply exceeds demand
Not applicable
Author

The RAM is 96 GB. The .qvw file size is only around 250 MB.

mmpas4887
Contributor III

Rob is correct. The message "Killing the QlikView Engine" means either the task timed out, the task was aborted manually, or the process was killed on the server.