Anonymous
Not applicable

Comparing applications deployed on two different servers

Hi All,

We have deployed all our QlikView applications, which were residing on the old server, to a new server. I now need to verify that the deployment was successful, i.e. that the applications on the old and the new server are the same and contain the same data. Is there any way to do this using some kind of script, or to what extent can I automate this process?

21 Replies
marcus_sommer

Within the load examples here: Symbol Tables and Bit-Stuffed Pointers - a deeper look behind the scenes you will find a script which loops through all tables and all fields of an application. You could strip out most of those loadings and just count the records and fields per NoOfRows() and NoOfFields() and the like.
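As a rough sketch (in QlikView load-script syntax) of that counting idea - assuming the data model is already in memory, e.g. per binary load, and with a hypothetical output file name:

let vTables = NoOfTables();  // fix the count before the loop adds its own table
for i = 0 to vTables - 1
    let vTable = TableName($(i));
    Counts:                  // identical field lists auto-concatenate
    load
        '$(vTable)'             as TableName,
        NoOfRows('$(vTable)')   as Rows,
        NoOfFields('$(vTable)') as Fields
    autogenerate 1;
next
store Counts into [counts_oldserver.csv] (txt);

Running the same routine against both deployments gives you two small files which could then be compared.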

And here are various examples of exporting objects to excel: https://community.qlik.com/search.jspa?q=excel+export&type=document.

- Marcus

Anonymous
Not applicable
Author

Hi marcus_sommer‌, thanks for the links - that's a very good post you have on symbol tables and pointers.

But Marcus, here I need to compare data from applications on two different servers. To run your script and loop through the tables, don't I need the data of the .qvws (to be compared) in the memory of the comparing application? For that, the only way I know is a binary load.

There are a few key things to note about the Binary Load statement:

1. It can only be used as the first statement of a load script.

2. There can only ever be one binary load per .qvw.
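For illustration, such a first statement would look like this (server path and file name hypothetical):

// must be the very first statement of the script
Binary [\\newserver\QlikDocs\Sales.qvw];
// afterwards the data model of Sales.qvw is in memory
// and further statements can follow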

Please correct me if I  am wrong and help me out.

marcus_sommer

A possible method could be to build a control application which loops through all your environments and folder structures, opens a prepared reading application and creates the load statement there (the binary part with path and filename from the loop, plus the additional tables/fields read from a variable - the export routines are already included within the prepared app), triggers it, and runs in this way through all (wanted) applications; when it's finished, this app will be closed.

Afterwards the control app could load all the output from the above-mentioned routines to check them against each other.
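Assuming each routine stored a small per-application summary (e.g. table name, row count, field count) as CSV, that check could, as a sketch, look like this (file names and layout hypothetical):

Compare:
load TableName, Rows as Rows_Old, Fields as Fields_Old
from [counts_oldserver.csv] (txt, utf8, embedded labels, delimiter is ',');

left join (Compare)
load TableName, Rows as Rows_New, Fields as Fields_New
from [counts_newserver.csv] (txt, utf8, embedded labels, delimiter is ',');

Result:
load *,
    if(Rows_Old = Rows_New and Fields_Old = Fields_New, 'OK', 'DIFFERENT') as Status
resident Compare;
drop table Compare;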

For setting the script and reloading the application you will need to use macros. They aren't very complicated, and the APIGuide.qvw helps a lot to get them running - for example, to open an application, set the script and reload:

rem ** VBScript **
set App = ActiveDocument.GetApplication
set newdoc = App.OpenDoc("C:\MyDocuments\QV4Automation.qvw", "", "")

rem ** add new line to script **
set docprop = newdoc.GetProperties
docprop.Script = docprop.Script & "Load * from abc.csv;"
newdoc.SetProperties docprop

newdoc.DoReload 2, false, false

' export routines ...

- Marcus

Peter_Cammaert
Partner - Champion III

Various interesting suggestions have been made for the ad-hoc checking of document compatibility in different environments.

This one may be a suggestion you could use for future checks. It's also usable for performing initial validation on document changes that may have a considerable impact on your results, or for performing validation checks after making changes that shouldn't have any impact at all, except in speed and memory consumption (i.e. the results should stay the same). It consists of the following steps. Change them according to your liking:

  • Add a new sheet at the end of the tab row
  • Add a conditional show expression to this sheet, so that it is displayed only to a select group of people (testers, developers, key-users performing acceptance checks, etc.)
  • On this sheet, place a series of essential aggregation objects and various counters of distinct and total field values, range checks, # of table rows, period based counts and sums, checks for values that should/should not occur (red/green lights) etc. You can extract design guidelines from the checks you perform as a developer, or from your testers that perform these checks anyway before a new version goes to production.
  • Start with an initial set of checks, and include each new check you discover while maintaining and testing this document.
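As an illustration, such a checking sheet could contain text objects or chart expressions along these lines (all field names are of course placeholders):

// total vs. distinct count of a key field
=Count(OrderID) & ' / ' & Count(DISTINCT OrderID)

// a main KPI, restricted to the latest year
=Sum({<Year = {$(=Max(Year))}>} Sales)

// red/green light: no rows without a customer key
=if(NullCount(CustomerID) = 0, 'OK', 'CHECK!')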

You don't have to create this sheet from the start as a fully equipped testing dashboard, you can let it grow over a period of time. The investment will be low at the beginning, and the rewards will be high.

Also note that many existing user-facing objects already provide a decent way to check the correspondence between a previous version and a new version of your document, so you don't need to duplicate those. This could be an initial verification when you want to be sure that the document in the original environment produces the same data as the document in the new environment: check the end results. If, after a full reload (all data stages), they are identical, they probably match. If not, then you know some investigation is required.

The idea is that these tools won't cost you a lot of work (you create them anyway and often they're thrown out at the end of a development cycle), and they'll be available when you need them.

Best,

Peter

Anonymous
Not applicable
Author

marcus_sommer‌ sorry, I didn't completely understand how to do this part:

"and open a prepared reading-application and creates there the load-statement (the binary part of path and filename from the loop + the additional tables/fields reading from a variable - the export-routines are already included within the prepared app)"


Can you please give me some more details if possible?

Anonymous
Not applicable
Author

Thanks, Peter, this definitely goes on my checklist when building applications from now on.

marcus_sommer

It means you create a new empty application and insert the macro routines which will later export your GUI objects ... On this part I notice that I have a gap in my above-mentioned logic, because you won't have the objects within this application when you load the data model per binary. This means a small adjustment is necessary: those routines will have to open the original app and export the objects from there.

As mentioned above, the effort to create a fully automated check of the data and all objects will be quite high, and I don't think I would use such an approach.

I would rather go with the suggestion from Peter (I already have admin sheets included). In addition, I try to keep my applications simple and don't overuse variables + set analysis + dimension/expression groups + alternate states + any kinds of actions and so on, so that the risk of different GUI results from identical data is quite small. For most of my stuff I would feel safe checking a few main KPIs within a table chart.

- Marcus

Anonymous
Not applicable
Author

Yeah, Marcus, I agree with you, but I have around 40 .qvws to compare and I can't edit the applications - I just need to find out whether they are different. :)

marcus_sommer

Couldn't you just make a copy of them and compare on that basis?

- Marcus

Peter_Cammaert
Partner - Champion III

Ideally, 40 different QVW documents means forty different key users. How about involving them in this upgrade? They already know what to look for. And distributing the effort may increase the quality of the checks and the speed with which you receive feedback...