
Scaling the QlikView Publisher


Last Update: Mar 24, 2023 5:07:51 AM
Created date: May 22, 2019 6:35:42 AM


Wondering how many engines to allow? How to scale the QDS cluster and identify bottlenecks? What configuration fine-tuning may be necessary? The whitepaper below provides basic best practices, which can be expanded on further through an engagement with Qlik Consulting Services.

We've compiled configuration guidelines on how to scale the QlikView Publisher (also known as the Distribution Service, or QDS).

Note! The advice collected in this paper is based on best practices put together by Qlik Architects and Support Engineers. While it provides a good baseline, it may need adjustments depending on your individual environment; such adjustments should be discussed with a Qlik Consultant.


QlikView Publisher is the reload and distribution engine in a QlikView deployment. Using QlikView Publisher, you can automate the generation of data stores to be consumed, the reload of fresh data, and the distribution of QlikView documents via email, to a specific QlikView Server, or to a directory folder. In the QlikView Management Console, the Publisher is referred to as the QlikView Distribution Service, which is also the name of the Windows service managing this role. As a service, it can be clustered across multiple Windows server nodes.

Content covered:

  • What is a QlikView Publisher?
  • Horizontal vs Vertical Scaling
  • Identifying Bottlenecks
  • Best Practices
  • Optimizing Stability and Performance
  • QlikView Distribution Service .config File
  • Publisher Groups
  • Overload Protection

Download the Scaling the QlikView Publisher whitepaper attached to this article.
