Does your QV application have a separate QVD loader application?
If not, you may want to split the application into two levels. The first level loads data from the source into QlikView format (QVD files), applying all possible transformations.
The second level is your main application, which fetches data from the QVDs produced in level 1 and contains the visualizations.
You would then schedule the main application's reload to start on completion of the QVD loader application's reload.
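A minimal sketch of what the two levels might look like (the connection, table, field and file names below are placeholders, not your actual setup):

// --- Level 1: QVD loader application (all transformations happen here) ---
ODBC CONNECT TO SourceDB;    // placeholder DSN

Sales:
LOAD OrderID,
     CustomerID,
     Date(OrderDate) AS OrderDate,
     Quantity * UnitPrice AS LineAmount;
SQL SELECT OrderID, CustomerID, OrderDate, Quantity, UnitPrice
FROM dbo.Orders;

STORE Sales INTO Sales.qvd (qvd);
DROP TABLE Sales;

// --- Level 2: main (visualization) application ---
// Loading the fields unchanged keeps this an optimized QVD load, which is very fast.
Sales:
LOAD * FROM Sales.qvd (qvd);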
Please let us know if that is not the case.
If you turn on the document's reload log, you will be able to see which steps take a long time to complete. In Document Properties | General, check the "Generate Logfile" option and perform a reload. The document log is a text file in the same folder and with the same name as the document, but with a .log extension.
If the reload takes too long, make a copy of the document and open it in QlikView Desktop (you will need a PC or server with sufficient RAM). Press Ctrl-E to open the script editor and select the debug option in the toolbar. Check the limited load option and set a maximum number of rows to reload. Start with a low number, say 5% of the actual row count, then click Run to perform the reload and examine the reload log. Repeat this, increasing the number of rows until it is large enough to reveal the problem.
Then focus your attention on the long-running portions of the load.
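If you prefer to cap the row count in the script itself rather than through the debugger, a FIRST prefix on the suspect LOAD gives a similar effect (the row count and file name here are arbitrary examples; remove the prefix once you are done testing):

// Temporarily limit the load to the first 100,000 rows while investigating.
FIRST 100000
FactTable:
LOAD *
FROM BigSource.qvd (qvd);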
170 MB constitutes a rather small-to-medium-sized document.
In the QMC task definitions (with or without Publisher), there is an input field for the maximum time a reload task may run. If this limit is reached, the Distribution Service will kill the reload job and the previous version of your document will remain available instead. See QMC->Documents->User Documents->Your document->Reload->Timeout seconds (no Publisher) or QMC->Documents->Source Documents->Your document->Task->Triggers->Task Execution Options->Timeout in Minutes.
The most probable cause of slow reloads is that your load script simply requires that amount of time to do what you tell it to do. Examples include JOINS with excessive memory requirements, multiple reads from large external tables, slow data source connections (e.g. SAP), nested FOR loops that execute 10000+ times, badly written LOAD statements, etc.
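For example, a JOIN that is used only to pull a single field into a large fact table can often be replaced with a mapping load, which avoids building the full joined table in memory. The field and file names below are invented purely for illustration:

// Instead of:  LEFT JOIN (Orders) LOAD CustomerID, CustomerName FROM Customers.qvd (qvd);
CustomerNameMap:
MAPPING LOAD CustomerID, CustomerName
FROM Customers.qvd (qvd);

Orders:
LOAD OrderID,
     CustomerID,
     ApplyMap('CustomerNameMap', CustomerID, 'Unknown') AS CustomerName,
     Amount
FROM Orders.qvd (qvd);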
As Jonathan suggested, you should first investigate where exactly your script is spending an inordinate amount of time. Then you can try to optimize that phase, or post that part of your script in this discussion if you want help.