Hi all,
I'm facing an issue with reloading a task on an 11G server. I'm getting the error log below.
When I reload the application with QlikView Desktop on the server, it works,
and there is no hidden script in the application.
Any ideas?
Thanks
(2015-02-20 11:40:59) Information: Starting task 'Market-Management-Particuliers-Book/Book - Particuliers.qvw'. Id:6fc1dd6c-dc88-4f21-a926-d5701e23aa24. Triggered by 'ManualStartTrigger'. Id:00000001-0002-0003-0405-0607080a0b0c
(2015-02-20 11:40:59) Information: Entering Task Execution.
(2015-02-20 11:40:59) Information: ClusterID=1
(2015-02-20 11:40:59) Information: QDSID=f3a25325-89af-3ac5-a95f-2dbce111cd5c
(2015-02-20 11:40:59) Information: TaskID=6fc1dd6c-dc88-4f21-a926-d5701e23aa24
(2015-02-20 11:40:59) Information: MaxRunTime=06:00:00
(2015-02-20 11:40:59) Information: Max attempts:1
(2015-02-20 11:40:59) Information: Current Attempt=0
(2015-02-20 11:40:59) Information: Task Dependencies are OK
(2015-02-20 11:40:59) Information: Document is marked to be Reloaded with fresh data. Initializing Reload for Distribution.
(2015-02-20 11:40:59) Information: Opening "D:\QV-ROOT\MARKET MANAGEMENT - PARTICULIERS\05-APPLICATIONS\BOOKS\Book - Particuliers.qvw"
(2015-02-20 11:40:59) Information: Allocating new QlikView Engine. Current usagecount=1 of 2
(2015-02-20 11:40:59) Information: Max retries:5
(2015-02-20 11:40:59) Information: Attempt:01
(2015-02-20 11:41:00) Information: Allocated QlikView Engine successfully. Current usagecount=1 of 2, Ticket=4
(2015-02-20 11:41:00) Information: Loading document "D:\QV-ROOT\MARKET MANAGEMENT - PARTICULIERS\05-APPLICATIONS\BOOKS\Book - Particuliers.qvw" (8259.87 Mb)
(2015-02-20 11:41:01) Information: Loading. LoadTime=00:00:01.0140065
(2015-02-20 11:41:03) Information: Loading. LoadTime=00:00:03.0420195
(2015-02-20 11:41:05) Error: Document open call failed. The document might require username and password.
(2015-02-20 11:41:05) Information: Attempted to load the document without data.
(2015-02-20 11:41:05) Error: The document failed to open.
(2015-02-20 11:41:06) Information: Closed the QlikView Engine successfully. ProcessID=5012
(2015-02-20 11:41:06) Error: Document could not be opened
(2015-02-20 11:41:06) Information: Closed the QlikView Engine successfully. ProcessID=5012
(2015-02-20 11:41:06) Information: Failed to check in document: D:\QV-ROOT\MARKET MANAGEMENT - PARTICULIERS\05-APPLICATIONS\BOOKS\Book - Particuliers.qvw
(2015-02-20 11:41:06) Error: The task "Market-Management-Particuliers-Book/Book - Particuliers.qvw" failed. Exception:
(2015-02-20 11:41:06) Error: QDSMain.Exceptions.DistributionFailedException: Distribute failed with errors to follow. ---> QDSMain.Exceptions.ReloadFailedException: Reload failed ---> QDSMain.Exceptions.FailedDocumentCheckoutException: Failed to check out document with path: D:\QV-ROOT\MARKET MANAGEMENT - PARTICULIERS\05-APPLICATIONS\BOOKS\Book - Particuliers.qvw
(2015-02-20 11:41:06) Error: at QDSMain.ReloadTask.Reload(String fileName, TaskResult taskResult, String sectionAccessUserName, String sectionAccessPassword, eReloadOptions reloadOption, String variableName, String variableValue, Boolean moniterCpuUsage)
(2015-02-20 11:41:06) Error: --- End of inner exception stack trace ---
(2015-02-20 11:41:06) Error: at QDSMain.ReloadTask.Reload(String fileName, TaskResult taskResult, String sectionAccessUserName, String sectionAccessPassword, eReloadOptions reloadOption, String variableName, String variableValue, Boolean moniterCpuUsage)
(2015-02-20 11:41:06) Error: at QDSMain.DistributeTask.Execute(TaskResult currentTaskResult)
(2015-02-20 11:41:06) Error: --- End of inner exception stack trace ---
(2015-02-20 11:41:06) Error: at QDSMain.DistributeTask.Execute(TaskResult currentTaskResult)
(2015-02-20 11:41:06) Error: at QDSMain.Task.AbstractTask.TaskExecution(ILogBucket logBucket, TaskResult taskResult)
(2015-02-20 11:41:06) Information: Task Execute Duration=00:00:06.4116411
(2015-02-20 11:41:06) Information: TaskResult.status=Finished
(2015-02-20 11:41:06) Information: Notifying all triggers of new state:FinishedWithErrors
(2015-02-20 11:41:06) Information: Notifying all triggers of new state:FinishedWithErrors - completed
(2015-02-20 11:41:06) Information: Saving Task Result
Agreed, I still think it looks like section access, but hey, let's wait and see!
One of our customers gets the same error.
We've checked the user and server properties.
The only thing we found was the date format in the QMC:
when you look at the task history, there are dates like 27.02.2015 and 03.09.2015.
Is it the same in your case?
Hi,
In my task history the date format is dd/mm/yyyy.
Hi Mohamed,
Is your issue resolved? I am facing the same issue in my environment.
Please help?
Hi Eriika,
I have put it aside for the moment; I will certainly come back to this issue shortly.
Have you checked the script code? Did you find any security part inside?
Let me know.
It may be a phantom error whose real cause is reaching the "Max Concurrent Reloads" limit.
When I click "start reload" while a lot of tasks are running, I often get this message, and there is no section access in those documents.
Hi,
I'm not sure if this has been answered.
Ahid, just make sure "Initial data reduction based on section access" is unchecked, along with all the other options highlighted in the image.
Regards
ASHFAQ
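For context on why section access keeps coming up: a hypothetical minimal Section Access block like the one below is the kind of script that can produce "Document open call failed. The document might require username and password." during a server reload, because the account running the QlikView Distribution Service must itself be granted access. The field values and account names here are illustrative assumptions, not taken from the failing app.

```qlikview
// Hypothetical minimal Section Access block (sketch only).
// If the account running the QlikView Distribution Service is not
// listed here with ADMIN access, server-side reloads cannot open
// the document, while opening it interactively as a listed user works.
Section Access;
LOAD * INLINE [
    ACCESS, NTNAME
    ADMIN, MYDOMAIN\QVSERVICE
    USER, MYDOMAIN\JSMITH
];
Section Application;
```

That would match the symptom in the first post: the reload works in Desktop on the server (running as a listed user) but fails under the service.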
Just posting to link to the other two threads on the same subject. None of the posts presently have a solution on them that has been marked correct.
https://community.qlik.com/thread/153448
https://community.qlik.com/thread/61952
https://community.qlik.com/thread/74218
I have previously tried most of the things referenced in those threads, at a client where the problem has been happening intermittently. I'm giving the increase in available engines a go now, but I have other sites where tasks simply wait rather than fail when this limit is reached, so I am not sure it is the solution.
Hi Andrei,
In your case it is because you need to increase the desktop heap or lower the number of reload engines.
Bill
We have found that increasing the desktop heap size seems to help this issue (same as what Bill said above).
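For anyone trying the desktop heap change: on Windows it is controlled by the SharedSection setting inside the "Windows" value of the registry key below. This is a sketch of the relevant fragment only, not the full value string (which varies by Windows version); the example size of 2048 KB is an assumption, and a reboot is required after changing it.

```
; Key:   HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\SubSystems
; Value: Windows  (a long string containing a SharedSection setting)
;
; SharedSection=xxxx,yyyy,zzzz
;   zzzz = desktop heap size in KB for non-interactive desktops,
;          i.e. the heap used by service processes such as the
;          QDS reload engines.
;
; Example fragment after raising the third value (reboot required):
SharedSection=1024,20480,2048
```

When that heap is exhausted, engine processes launched by the service can fail to open documents, which would explain why the error appears only when many reloads run at once.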