Qlik Community

QlikView Deployment

Discussion Board for collaboration related to QlikView Deployment.

Not applicable

source data readiness and reload tasks

The major data source for our QV apps is an Oracle data warehouse. It pulls data from other systems and massages, transforms, validates it, etc.

I cannot pull any data from that database until ALL processing is complete. Fortunately, it is easy to determine when it is done: I can issue a query that returns a "Y" or "N" (or whatever).

The "database ready time" is variable, usually 3-6AM, downtime notwithstanding.

We have 20+ QVD builders that reload from this database. Theoretically, they could all run in parallel, as they are unrelated. Practically, our QV server cannot handle it.

Here is my question: is there a way to set all 20+ tasks to "jump in line" when the source database is ready, so that QV works through them as fast as possible?

Ideally, they would:

  • begin loading as soon as the database was ready each day
  • load independently of each other's success or failure status

I have toyed with setting them to run multiple times beginning at 3 AM and forcing a failure when the "is database ready" query returns "N". That is kind of messy (errors in QEMC, the Event Log, and the QV logs).

What I HAVE NOT done is set up a single task that issues the "is database ready" query and make all 20+ tasks dependent on the success of that task.
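For what it's worth, the gatekeeper task's load script could be a few lines along these lines. This is only a sketch: the table and field names (DW_STATUS, READY_FLAG) and the UNC path are made up, and you'd substitute your own readiness query.

```
// Hypothetical gatekeeper app: the task succeeds only when the warehouse
// reports it is ready. DW_STATUS / READY_FLAG are placeholder names for
// whatever your real "is database ready" query uses.
Ready:
SQL SELECT READY_FLAG FROM DW_STATUS;

LET vReady = Peek('READY_FLAG', 0, 'Ready');

IF '$(vReady)' <> 'Y' THEN
    // Deliberately fail the reload so the 20+ dependent tasks do not start.
    // Loading a file that does not exist throws a script error.
    SET ErrorMode = 1;
    LOAD * FROM [\\nowhere\force_failure.qvd] (qvd);
END IF
```

If this gatekeeper is scheduled to retry every few minutes from 3 AM, the dependent tasks would kick off on its first success, which is more or less the "jump in line" behavior described above, though the gatekeeper's own retries still log failures.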

Any thoughts or past experience would be greatly appreciated.


2 Replies
Honored Contributor II

source data readiness and reload tasks

Have you considered using ordinary batch files?

  • Triggered continuously by the Windows scheduler,
  • the batch starts an inspection application (QV) which, at the end, produces a file if the update is finished;
  • if the file exists (or even inspect its contents), it continues with the batch execution (qv.exe /r yourapplication.qvw);
  • if not, it terminates and waits for the next trigger from the scheduler.

Note: applications initiated in a batch should have SET ErrorMode = 0, so that they do not stop on errors.

You could construct something similar within QV itself (via EXECUTE), but personally I trust batch files more.
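As a rough sketch, the inspection application's script might just run the readiness query and drop a flag file on success. All names here (DW_STATUS, READY_FLAG, the flag path) are placeholders, not tested against a real system.

```
// Hypothetical inspection app: write a flag file once the warehouse is ready.
Ready:
SQL SELECT READY_FLAG FROM DW_STATUS;   // placeholder readiness query

LET vReady = Peek('READY_FLAG', 0, 'Ready');

IF '$(vReady)' = 'Y' THEN
    Flag:
    LOAD Now() as ReadyAt AutoGenerate 1;
    STORE Flag INTO [C:\qv\flags\db_ready.txt] (txt);
END IF
```

The batch file then only needs to test for the flag file (e.g. `if exist C:\qv\flags\db_ready.txt qv.exe /r yourapplication.qvw`) and exit otherwise, waiting for the next scheduler trigger.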


Not applicable

source data readiness and reload tasks

Thanks Peter,

What you describe is (almost) exactly what I have in place now, and it works great. I use an Oracle package and a few PowerShell scripts to manage 100+ reloads and publish events. I spawn concurrent reloads based on data readiness and available RAM/CPU, and check the log files for errors.

The reason I'm trying to find a QV solution is that our QV developer team and application set are growing. Everything new has to come through me, and any issues can only be resolved by me. I don't scale well :-) and will soon be a bottleneck/single point of failure. Not good for me or my team.

If the QV scheduler simply cannot do anything like this (hence the current process) then we have to come up with another plan.

I know we are not the only ones with this "data readiness" issue so I thought I'd kick it out to the forum.