datanibbler
Champion

qvd file blocked - can I somehow prevent that?

Hi,

the new report I have built is supposed to be run locally by the users, and of course it reads the qvd file.

It seems to me that the qvd file is blocked as long as an app that is supposed to read it is open - even if it is not actually being read at that moment. Can that be?

Anyway, our primary load_script runs once an hour, and if the qvd file is blocked at that precise moment and cannot be written to, there is of course an error and the whole reload fails.

Is there any way I can prevent that?

Thanks a lot!

Best regards,

DataNibbler

22 Replies
datanibbler
Champion
Author

Thanks Piet!

I've had other situations before where, out of a rather long list of files, I had to find the newest one, using some part of the filename, so I know roughly how to do that.

For now, I'll try my idea. But if that doesn't work, I will try yours - I think I could alter the load_script to write a file with no suffix and then, as the very last command after the load has finished, use an EXECUTE cmd command to rename it with the suffix _old (deleting the existing _old file, or just overwriting it). The report could then always read from the qvd file with the _old suffix, so it would never end up reading the same file that my load_script needs to write to.
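Roughly what I have in mind - the paths and table name here are only placeholders, and EXECUTE of course only works if external programs are allowed for the script:

// written by the hourly load_script (placeholder names):
STORE TransTable INTO [D:\QlikData\Trans.qvd] (qvd);

// ... rest of the load_script ...

// very last command: rename the fresh file to *_old, overwriting the previous one
EXECUTE cmd.exe /C move /Y "D:\QlikData\Trans.qvd" "D:\QlikData\Trans_old.qvd";

// the locally run report then always reads the *_old file:
// LOAD * FROM [D:\QlikData\Trans_old.qvd] (qvd);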

Best regards,

DataNibbler

datanibbler
Champion
Author

Oops - indeed it doesn't seem to work, which is very strange: I am loading the filesize of the log file, which is 0 while the script is running. The first time I tried this, a NULL value was loaded - but now (the script is currently running and the filesize as I see it is 0), the value loaded is the size the file had the last time the script finished ...

I cannot quite make head or tail of this.

But I could now try - no, Piet's approach won't work either: the load_script on the server would have to run a command line to rename the file (adding the suffix _old) and thus overwrite the existing _old file - but programs running on the server are not allowed to use the command line 😉

I would have to implement code in the report (which is run locally) to do essentially that - but then there could be conflicts again if the load_script tries to write to the new file at the same moment the report tries to rename it ... only in that instance the priority would be reversed: the renaming would fail because the file is blocked by the writing process.

Oh my, this is complicated ...

datanibbler
Champion
Author

Ah - I've got another idea 😉

I can query the time the qvd file was last generated - and since I know that the load_script runs once an hour, it is fairly certain that roughly one hour (give or take) passes between the creation of one qvd and the next, right?

So I can check whether the current timestamp is 55 minutes or more past the creation timestamp of the existing qvd. If it is, the relevant part of the load_script is probably running or just about to run, and the user should wait a few minutes or maybe even close QlikView for a few minutes ...
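As a rough sketch of what I mean - assuming v_Pfad_TransDtl holds the full path to the qvd, and that filetime() can be given an explicit file name outside a LOAD:

// minutes since the qvd was last written
LET vMinutesSinceQvd = Floor((Now() - FileTime('$(v_Pfad_TransDtl)')) * 24 * 60);

IF $(vMinutesSinceQvd) >= 55 THEN
   TRACE The hourly load_script is due shortly - please close QlikView for a few minutes.;
   EXIT SCRIPT;
ENDIF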

datanibbler
Champion
Author

Hi,

finally it works.

What could still happen is that the report is open and the user is not at their desk when this warning appears, so they cannot close it - is there any possibility to close QlikView altogether from the script?

Best regards,

DataNibbler

datanibbler
Champion
Author

Oops - I celebrated too early 😉 It doesn't work yet, and I can't quite understand why - the data types match.

The code I use for finding out whether the relevant part of the data_loading script is currently running is this:

FIRST 1 LOAD
  TIME(filetime('$(v_Pfad_TransDtl)')) as Letzte_Generierung, // (that is a filepath_variable pointing to the qvd file)
  MINUTE(TIME(NOW()) - TIME(filetime('$(v_Pfad_TransDtl)'))) as Zeit_seither_min,
  HOUR(TIME(NOW()) - TIME(filetime('$(v_Pfad_TransDtl)'))) as Zeit_seither_hr
RESIDENT Werkscheck
;

// The data_loading script runs every hour, so the time between one updating of the qvd and the next is always equal.
IF num(Zeit_seither_min) >= 55 THEN
   LOAD
       MsgBox('Die Archivierungslogik in der LOAD_AOS muss in Kürze auf die Datendatei zugreifen. Bitte QlikView ganz schließen, in ca. 10min kann es weitergehen.', 'Fehlschlag', 0, 64, 0) as Meldung
     autogenerate 1;
   EXIT SCRIPT;
ENDIF

EXIT Script;

(I first had a bigger construct to differentiate between the two plants, but for now I only want to get that query going for the one plant where the load_script runs every 60min.)

The datatype of that field seems to be numeric all right, but still - the code runs, and a similar query in the GUI does work, but this doesn't: the MsgBox does not appear even when that field is 57 or 58.

Can anyone spot the error in that code?

Thanks a lot!

Best regards,

DataNibbler

P.S.: OK, whatever, I now made that into a variable using PEEK() and that works.

marcus_sommer

Hi DataNibbler,

you can't access the field "Zeit_seither_min" directly in the script - you will need a peek() for this:

peek('Zeit_seither_min', 0, 'Tablename')
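Applied to your script it could look roughly like this - I am assuming a table label (e.g. AlterCheck) for your FIRST 1 LOAD, because peek() needs one; adjust the names to your script:

AlterCheck:
FIRST 1 LOAD
  TIME(filetime('$(v_Pfad_TransDtl)')) as Letzte_Generierung,
  MINUTE(TIME(NOW()) - TIME(filetime('$(v_Pfad_TransDtl)'))) as Zeit_seither_min
RESIDENT Werkscheck;

// pull the field value into a variable - the IF cannot read a field directly
LET vZeit_seither_min = peek('Zeit_seither_min', 0, 'AlterCheck');

IF $(vZeit_seither_min) >= 55 THEN
   // show your MsgBox warning here, then stop the reload
   EXIT SCRIPT;
ENDIF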

I assume it's the old story of not seeing the forest for the trees ("... der Wald und die Bäume ...").

- Marcus

datanibbler
Champion
Author

Hi Marcus,

yep, you're right - as I just wrote, it works with PEEK(); I only just found that out myself, you were faster 😉 I was looking at a lot of trees - first trying that >>_Inuse!!<< file and then the filesize of the log - that was quite a long jog ...

The only thing I still wonder is whether it would be possible to close QlikView altogether from the script - ah, no, to do that the script would need to run, and if a user is not at their desk while the report is merely open, the script cannot run, so that wouldn't do - but I guess that should not affect the qvd, and the update by the LOAD_AOS should succeed.

This whole thing is done then, the thread can be closed.

Best regards,

DataNibbler

marcus_sommer

Normally there should be no blocking of qvd's, and if it happens (and it is ensured that no qv-task is running in parallel), it will be caused by some conflict between the OS (filesystem), Qlik and further processes such as Windows shadow copies (take a look at the Windows logs) or any scanners - maybe you could convince your IT to exclude qvd's from any scanning (it would reduce the number of possible causes and likely improve performance, too).

- Marcus

datanibbler
Champion
Author

Hi Marcus,

unfortunately this issue just resurfaced - there can be conflicts on other qvd files used by the report as well, and it just happened again, causing a rather critical data_loading script to fail.

Convincing IT of anything will be rather difficult since QlikView is not even officially supported by IT - and anyway, IT tends to go its own way here ...

But I have another idea to exclude any possibility of read-vs-write conflicts:

I cannot run a batch file on the server - but locally that should be possible, so the report should be able to use the EXECUTE command to create a copy of the qvd file and load that ... I just have to see how long creating the copy takes - if it takes too long, the report has to check first whether a copy already exists ...

P.S.: Hmm ... but the report should still always use the most current data file available, so I have to check the filetime() of that qvd copy against the filetime() of the "original" qvd, and if the latter is newer, it has to be copied again - or I could just always copy it, so that I don't have to check anything - but then there might again be conflicts if the app wants to copy the file at the very instant the load_script wants to write to it ...
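A rough sketch of what the report could do locally - the paths are only placeholders, and EXECUTE must of course be allowed for the locally opened document:

LET vSourceQvd = '\\server\QlikData\TransDtl.qvd';   // placeholder network path to the "original" qvd
LET vLocalCopy = 'C:\QlikLocal\TransDtl_copy.qvd';   // placeholder local working copy

// only refresh the copy when the original is newer (or when no copy exists yet)
IF IsNull(FileTime('$(vLocalCopy)')) OR FileTime('$(vSourceQvd)') > FileTime('$(vLocalCopy)') THEN
   EXECUTE cmd.exe /C copy /Y "$(vSourceQvd)" "$(vLocalCopy)";
ENDIF

TransDtl:
LOAD * FROM [$(vLocalCopy)] (qvd);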

I guess that ultimately this risk cannot really be excluded when QlikView apps are used both locally and on the server. We'll just have to live with it.

marcus_sommer

Hi DataNibbler,

without real access to the server you will always be curing the effects and not the cause ... (I know that you know it) ... and therefore it will be difficult to find a stable solution. I think you will at least need access to the log files in any case.

- Marcus