I am running a specific report that fetches data from an Oracle database. The issue is that the reload fails after exactly one day, both on the server and in the QMC. The timeout in minutes under Triggers in the QMC has been set to 4000. Could you suggest any checkpoints that might cause the report to fail after exactly one day? Also, could there be a setting on the database side that disconnects any connection that has been open for more than a day?
Yes, we usually perform an incremental load in the QVD generator. However, in some cases we have to do a full load, and we have a huge amount of data. So yes, in those cases the reload takes more than one day.
The break might be caused by Oracle itself or by the network (or by some security tools/measures in between).
Personally, I wouldn't look further in this direction, but rather slice the task into smaller parts. Besides this, there might be some potential to optimize the queries ...
I agree with Marcus.
Why not slice the data in the query? Then you can concatenate multiple reloads.
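To illustrate the slicing idea, here is a minimal load-script sketch. The table name FACT_TABLE and the date column TXN_DATE are placeholders for your actual Oracle objects; in QlikView, loads with identical field sets auto-concatenate into one table, so each year's slice lands in the same QVD.

```
// Hedged sketch: one SQL extract per year instead of one giant query.
// FACT_TABLE and TXN_DATE are hypothetical names - substitute your own.
FOR vYear = 2018 to 2019
    LET vNextYear = $(vYear) + 1;

    Facts:                      // identical fields => auto-concatenate
    SQL SELECT *
        FROM FACT_TABLE
        WHERE TXN_DATE >= TO_DATE('$(vYear)-01-01', 'YYYY-MM-DD')
          AND TXN_DATE <  TO_DATE('$(vNextYear)-01-01', 'YYYY-MM-DD');
NEXT vYear

STORE Facts INTO Facts.qvd (qvd);
```

Each iteration is a shorter-running query, so no single database session has to stay open for a whole day.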
I would also look into the query. Does it have too many joins? If so, why not eliminate the joins and load each table separately? You can then join the tables in server memory in the extractor, or use some other data-warehousing solution.
You could also consider writing an application in Java, Python, or C# that loads your data in chunks and stores it in multiple Excel files, from which QlikView then loads.
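As a rough sketch of that chunked-extract approach in Python: the snippet below uses the stdlib sqlite3 module as a stand-in for the Oracle connection (with oracledb/cx_Oracle the cursor loop is the same) and writes CSV instead of Excel to stay stdlib-only. All table, column, and file names are hypothetical.

```python
# Sketch of chunked extraction: fetch the result set in fixed-size chunks
# and write each chunk to its own file for QlikView to pick up.
import csv
import sqlite3

CHUNK_ROWS = 2  # tiny for the demo; use e.g. 500_000 in practice


def extract_in_chunks(conn, query, chunk_rows=CHUNK_ROWS):
    """Yield (chunk_index, header, rows) for each chunk of the result set."""
    cur = conn.cursor()
    cur.execute(query)
    header = [d[0] for d in cur.description]
    index = 0
    while True:
        rows = cur.fetchmany(chunk_rows)
        if not rows:
            break
        yield index, header, rows
        index += 1


def dump_chunks(conn, query, prefix):
    """Write each chunk to prefix_NNNN.csv; QlikView then loads prefix_*.csv."""
    files = []
    for index, header, rows in extract_in_chunks(conn, query):
        name = f"{prefix}_{index:04d}.csv"
        with open(name, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(header)
            writer.writerows(rows)
        files.append(name)
    return files


# Tiny in-memory demo table standing in for the Oracle source
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO fact VALUES (?, ?)",
                 [(i, i * 1.5) for i in range(5)])
chunks = list(extract_in_chunks(conn, "SELECT id, amount FROM fact"))
print([len(rows) for _, _, rows in chunks])  # → [2, 2, 1]
```

Because each chunk is a separate fetch, no single fetch runs long enough to hit a session-duration limit on the database side.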
Let us know what works for you and if you need any further help or support.
Thank you for the response.
This is purely a QVD generator. We are not performing any joins; it just fetches the data and stores it in QVDs. Regarding slicing the data: we usually run it for two years (2018, 2019), and because the reload was taking too long, I tried running it for 2019 only. As I said, because of the huge amount of data, it still takes a long time to fetch the data from just one table and for just one year.
@marcus_sommer I was guessing it might be a restriction on the Oracle side; however, I had to confirm on the QlikView side before reaching out to the people responsible for Oracle. As per your suggestion, I think I need to contact the database team for help on this, and I will do so.
Thank you again.
If the data for one year is that large, it might be better to slice it at the YearMonth level.
Further, check which data is really needed from Oracle and whether it could be optimized in any way. If your loads are just "select * from table;", there are quite probably a lot of opportunities, for example:
- only loading fields that are used in Qlik for matching/calculations (SQL tables often contain various internal IDs and flags that aren't helpful on the outside)
- splitting timestamps into separate date and time fields
- removing formatting and/or converting fields
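The measures above could look something like the following load-script sketch. The table ORDERS and its fields are placeholder names, not anything from the original thread:

```
// Hedged sketch: select only the needed fields (no SELECT *) and split
// the Oracle timestamp into separate date and time fields on the Qlik side.
// ORDERS, ORDER_TS, AMOUNT etc. are hypothetical names.
Orders:
LOAD ORDER_ID,
     CUSTOMER_ID,
     Date(Floor(ORDER_TS)) as OrderDate,   // date part only
     Time(Frac(ORDER_TS))  as OrderTime,   // time part only
     Num(AMOUNT)           as Amount;      // drop source formatting
SQL SELECT ORDER_ID, CUSTOMER_ID, ORDER_TS, AMOUNT
    FROM ORDERS;
```

Splitting a timestamp this way reduces the number of distinct values per field, which shrinks the QVD noticeably on high-cardinality data.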
There are surely more measures that could significantly reduce the data size on the Qlik side, and they don't necessarily increase the run-times, because I assume the biggest bottleneck is rather on the database and/or network side.
Divya, this one is actually easy! 🙂 You are missing the master max time setting adjustment; once you set that, things should be able to run as long as you need them to.
Here is a link to an article that should take care of things for you, I believe: