martin_escobar
Contributor II

Accelerating a publishing task in NPrinting 17.6

Hi All,

I am new to this community so this is my first question/idea.

In our company we use NPrinting (ver. 17.6), and we recently created a general report for daily distribution in HTML so it can be embedded in the mail body. No problems there; it works as expected.

The issue I ran into is that the publishing task takes 90 minutes on average. The challenge I am facing (and probably some of you are too) is how to accelerate the publishing task.

I have read some threads in this community discussing this problem and some good practices, so I want to share my own experience around this issue and a solution I found (not too elegant, but functional in the end).

As I mentioned, we have NPrinting 17.6 and QlikView 12.1 running together on a virtual machine with 8 vCPUs and 32 GB of RAM.

The app for this report was connected to two documents (Shop Floor and Supply) via server connections. The first version connected directly to the full documents, and reloading the metadata took forever. After a few months of testing I learned that it is good practice to have a separate document containing only the objects included in the NPrinting template, so I created new versions of our QV documents using binary loads (Shop Floor Binary and Supply Binary). The most noticeable improvement from this change was in reloading the metadata.

I have two apps for the same report. The first is called "HTML Report Live" and the second "HTML Report Test". The publishing task runs against the live version, which has server connections to the binary documents; the test version has direct connections to the same documents. So my solution was to change the connections in the live version to direct connections and create triggers to reload the metadata for both documents before publishing the report. This now takes about 5 minutes in total, including the metadata reloads for both documents and the HTML report publication. Big improvement!!

Still, I am not very happy with this solution: NPrinting doesn't let you chain tasks in cascade the way the QMC does, so each trigger has to be scheduled at a different hour.

Any thoughts around this issue and the solution I have so far?

Best regards to all!

Martín.

4 Replies
glacoste
Creator

In my experience:

- NPrinting and QlikView need different machines. NPrinting uses the maximum resources available on the machine, which hurts QlikView Server performance if it runs on the same machine.

- From the NPrinting machine, create a UNC connection to the QVW file; it is much faster than QVP. This is because it uses the resources (RAM, CPU) of the NPrinting server instead of the QlikView server, which will normally have more requests in progress.

- If you can add more vCPUs to the machine, this can improve performance noticeably. Consider the following: NPrinting opens one QVW instance of the file per CPU and uses them in parallel.

- For the triggers, see https://community.qlik.com/thread/292037?sr=inbox&ru=187069. It is written for Qlik Sense but works for QlikView: basically, you can call an NPrinting task from a QlikView document using the NPrinting API and the Qlik REST connector. So you can make a QVW only for calling the NPrinting task and configure it in QlikView to reload when your data QVW reloads successfully (see the sketch below for the API calls involved).
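
To illustrate the chaining idea, here is a minimal sketch in Python rather than in a QVW with the REST connector. It assumes the NPrinting REST API is reachable on port 4993 with NTLM authentication and that POST /tasks/{id}/executions starts a task while a GET on that execution returns its status; check the API reference for your NPrinting version before relying on the exact paths, field names and status values. The host, account and task IDs below are placeholders.

import time
import requests
from requests_ntlm import HttpNtlmAuth   # pip install requests-ntlm

BASE = "https://nprinting-server:4993/api/v1"             # placeholder host
AUTH = HttpNtlmAuth("DOMAIN\\svc_nprinting", "secret")    # placeholder account

def run_task_and_wait(task_id, poll_seconds=30):
    # Start an NPrinting task, then poll its execution until it stops running.
    r = requests.post(f"{BASE}/tasks/{task_id}/executions", auth=AUTH, verify=False)
    r.raise_for_status()
    exec_id = r.json()["data"]["id"]                      # response field assumed
    while True:
        s = requests.get(f"{BASE}/tasks/{task_id}/executions/{exec_id}",
                         auth=AUTH, verify=False)
        s.raise_for_status()
        status = s.json()["data"]["status"]               # response field assumed
        if status not in ("Enqueued", "Running"):
            return status
        time.sleep(poll_seconds)

# Run the reload task first, then the publish task, instead of scheduling the
# two triggers at fixed offsets and hoping the first one has already finished.
if run_task_and_wait("RELOAD-TASK-ID") == "Completed":   # status value assumed
    run_task_and_wait("PUBLISH-TASK-ID")

The same two calls can be issued from a QVW through the Qlik REST connector, as described in the linked thread, so the chain can be driven entirely from the QlikView side.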

Lech_Miszkiewicz
Partner Ambassador/MVP

Gustavo

I agree with you on the first and last of your bullet points, but I strongly disagree with the second and third.

Regarding the second point, I can agree that generating reports using a LOCAL connection works faster, BUT!!!

  • you lose the management and performance that QlikView Server provides
  • you need to make sure that both machines (Qlik and NPrinting) have enough resources to open and heavily work not one, but many instances of the same QV document
    • as an example: if you use an NPrinting LOCAL connection to a QV document, it opens on the NPrinting server and takes up, say, 2 GB of RAM just to open. With, say, 8 CPUs on board, NPrinting will try to open 8 instances of your QV.exe, which in this case will kill the box because you will run out of memory. Since these are local (desktop) clients there is no memory management, and your NPrinting box will simply stop responding. I have come across this mistake so many times with clients who tried to do NPrinting by themselves and, after a few days, came back asking why on earth NPrinting is so unstable and hangs (see the rough sizing sketch after this list)
    • so, on this note: if your Qlik app is very small, then yes, a LOCAL connection may be a solution, but a big application can and will kill your NPrinting box sooner or later
    • I strongly recommend a QVP connection - and yes, report generation can be slower with it, but in the long term it will pay off. The NPrinting team seems to keep improving it all the time, so hopefully it will work better soon
  • the third point is valid only if you add no more than 8 cores. The help documentation clearly says what happens if you add too many cores to your NPrinting setup - IT WILL DROP PERFORMANCE, as your tasks can start clashing with each other
  • regarding the last point - I agree that publish tasks are available via the API, but with a LOCAL connection don't you also need a metadata reload trigger? I am not so sure the API allows you to do that too - you would need to check this
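
To make the sizing point above concrete, here is a rough back-of-the-envelope helper (my own illustration, not an official formula): it multiplies the per-instance RAM of the document by the number of QV.exe instances NPrinting may open (one per core) and adds some headroom for the OS and services. The 2 GB and 8 cores mirror the example above; measure your own document, since its in-memory footprint grows well beyond its on-disk size once objects start calculating.

# Rough capacity check for LOCAL (desktop) connections.
# Assumption from the example above: one QV.exe instance may open per core,
# so RAM needs scale with the core count.
def local_connection_ram_gb(cores, gb_per_instance, overhead_gb=4):
    # Estimated peak RAM if every core opens its own copy of the document.
    return cores * gb_per_instance + overhead_gb

estimate = local_connection_ram_gb(cores=8, gb_per_instance=2)
print(f"Estimated peak RAM: {estimate} GB")  # compare against the box's total RAM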

@Martin - what I can add to improve your performance is:

  • minimize all the objects in your QV document - this will improve calculation time and any on-open actions
  • make sure you drop all fields from your data model that are not used in NPrinting
  • maybe update to the latest NPrinting version - there are significant improvements there, also in performance (or maybe wait a month for the July version, already available as a Technical Preview)
  • it looks like you have done great homework on the other things - so keep us posted on how it goes

regards

Lech

martin_escobar
Contributor II
Author

Thanks Guys for your comments!

-Martin.

martin_escobar
Contributor II
Author

 

Hi Guys,

I followed your recommendations. Now I have different machines for each server and it is working well. I am also using server connections.

Thanks.

-Martin