QlikProductUpdates

Qlik now offers capacity-based pricing and packaging for Qlik Analytics, building on the rollout of the Data Integration capacity model in Q2. Starting today, customers can select the pricing tier that best suits their data movement and data analysis needs.

What does this mean for you?  

Capacity pricing removes restrictions around data sources and users, delivering greater value from data loading and analysis. The new pricing model uses data as its primary value metric and is designed to help you better leverage your accessible data sets and support more analytics use cases with the AutoML and GenAI capabilities in Qlik Cloud. The structure of the new model is based on extensive market research and customer feedback, with the goal of providing greater flexibility and predictability when using Qlik Cloud. You can now subscribe to pre-defined data packs at a fixed monthly cost, making it easier to adopt the full set of cloud capabilities. And with embedded telemetry, organizations can understand and monitor their data usage.

Capacity pricing offers the following benefits:  

  • Simplified pricing: data as the primary value metric, aligned with customer needs and ease of doing business with Qlik
  • Expanded user access: support for more analytics use cases within your organization
  • Capability adoption: access to key cloud capabilities and platform value drivers for faster time-to-value
  • Data usage transparency: insights into data utilization and telemetry within Qlik Cloud


What are the new pricing tiers?

The details of Qlik's capacity pricing and packaging options are available on a new pricing page on Qlik.com, which shows the list prices of each plan. New customers can choose from three distinct plans: Standard, Premium, and Enterprise. Existing customers can move to the new pricing model on their own timeline and decide which tier best fits their needs. Work with your account team to explore migration to the capacity model.

 

41 Comments
StefanBackstrand
Partner - Specialist

@E_Røse Well, the size of the table and the file will depend on the data modeling and what you join into the table.

Are you saying that a 1:1 straight-up table from Snowflake, loaded into and stored as a QVD, is 2x the size of the data in the Snowflake table?

E_Røse
Creator II

@StefanBackstrand 

"Are you saying that a 1:1 straight up table from snowflake loaded into and stored into a QVD is x2 the size of the data in the Snowflake table?"

Yes, that's what I am saying. My example here is a quite standard fact table with ~60 million financial transactions. (I am not looking for optimization tips here; I am just saying to be cautious before stating that a large columnar data warehouse will easily fit into the capacity limits because of QVDs.)


StefanBackstrand
Partner - Specialist

@E_Røse OK, maybe you should? A QVD 2x the size of its table either means a grossly inefficient model or some other problem. 🙂

sathish_sampath
Employee

Hi there, 

Here is the link to the online guide: https://help.qlik.com/en-US/cloud-services/Subsystems/Hub/Content/Sense_Hub/Admin/subscription-optio...

With regard to the size, data stored in Snowflake is in a compressed format. When you bring in data from Snowflake, we take the size of the raw data (uncompressing what is in Snowflake), and that size is used for calculation purposes.
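To make that arithmetic concrete, here is a minimal sketch under an assumed compression ratio; the 3x figure and the function name are purely illustrative, not Qlik numbers:

# Illustrative only: Qlik counts the raw (uncompressed) size of data brought in
# from Snowflake, not the compressed size Snowflake reports. The 3x ratio below
# is an assumed example value.
def estimated_counted_size_gb(compressed_gb: float, compression_ratio: float = 3.0) -> float:
    """Rough estimate of the size counted toward capacity for a Snowflake source."""
    return compressed_gb * compression_ratio

# A table Snowflake stores as 10 GB compressed could count as roughly 30 GB of raw data.
print(estimated_counted_size_gb(10.0))  # 30.0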

 

E_Røse
Creator II

@StefanBackstrand 

"Ok, maybe you should? x2 QVD size vs its table either means a grossly ineffective model or some other problem"

That depends on the extent to which you consider the free-text fields related to financial transactions a problem or a useful feature. And this is not at all relevant to the question I had about computing the used capacity, so can we please leave that discussion for another time.

StefanBackstrand
Partner - Specialist

@sathish_sampath Even in the case of using Qlik Cloud (QVDs) as Data Platform? That is not what I've been told before.

This is what I've been told:

- If using Qlik Cloud (QVDs) as Data Platform in QCDI, Data Moved is not counted against quota - only resulting QVD files on disk (storage) in Qlik Analytics are counted against quota.

- If using any other Data Platform, like Snowflake, the data is counted against quota the way you describe: total amount moved.

Is this correct?

It seems to be confirmed by this help page: https://help.qlik.com/en-US/cloud-services/Subsystems/Hub/Content/Sense_Hub/Admin/mc-monitor-consump... under "Data for Analysis" and "Data Moved".

 

sathish_sampath
Employee

You are right, Stefan. That's what I had mentioned in one of my earlier responses as well. Data Moved is not counted when the target is Qlik Cloud.
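For anyone trying to reason about this, the rule as confirmed above could be sketched roughly like this (a simplified illustration only; the function and target names are made up, and the actual metering is done by Qlik's telemetry):

# Assumed rule, based on the discussion above and the linked help page:
# Data Moved counts toward quota only when the target is an external platform;
# when the target is Qlik Cloud (QVDs), only the resulting stored files count
# under Data for Analysis. Names here are hypothetical.
def data_moved_counted_gb(gb_moved: float, target: str) -> float:
    """How much of a data movement task counts toward the Data Moved metric."""
    if target == "qlik_cloud":   # landing as QVDs in Qlik Cloud
        return 0.0               # only the stored QVDs count, under Data for Analysis
    return gb_moved              # external targets (e.g. Snowflake) count the full amount

print(data_moved_counted_gb(50.0, "qlik_cloud"))  # 0.0
print(data_moved_counted_gb(50.0, "snowflake"))   # 50.0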

E_Røse
Creator II

@sathish_sampath @QlikProductUpdates I did not really get an answer to all my questions, so I will try to reformulate.

This is all regarding how Data Analyzed is measured when using Qlik Sense apps to load data from external sources.

1) Say I have an app that creates a QVD of 1 GB. The QVD uses 1 GB of the quota. If I understand your help page correctly, the "peak application reload size" also counts towards the quota? What is peak application reload size referring to: the in-memory size of the app? The QVF file size? Something else?

2) In the product description it says that "Data Analyzed refers to data loaded into Qlik Cloud from an external source or created in Qlik Cloud, and includes all data being analyzed, profiled by the catalog, or landed into Qlik Cloud." How do you measure the size of the data profiled by the catalog?

3) Is it the sum of peak application reload size, data files and profiled data that counts towards the quota, or the max of peak application reload size, data files and profiled data?

4) Are the user-based subscriptions still available for new customers?

sathish_sampath
Employee

Hi there, 

The intent of "peak application reload size" is that when there are multiple reloads happening on the same day, the largest size of the app is the one taken into consideration for the calculation of "Data for analysis". For example: if you have an app that is 1 GB, and you reload a few times in the day, resulting in app sizes of 0.75 GB, 1.25 GB, and 1.10 GB, the 1.25 GB would be taken into consideration.
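As a minimal sketch of that rule (the function name is hypothetical, for illustration only):

# The largest app size produced by that day's reloads is the one counted
# toward Data for Analysis.
def peak_reload_size_gb(reload_sizes_gb: list[float]) -> float:
    """Return the largest app size among a day's reloads."""
    return max(reload_sizes_gb)

# The example above: reloads of 0.75 GB, 1.25 GB and 1.10 GB count as 1.25 GB.
print(peak_reload_size_gb([0.75, 1.25, 1.10]))  # 1.25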

Hope this helps, 

-Sathish

AdamBS
Partner - Creator

I think I am missing something, but I can't find a reference to "peak application reload size". I did see a slide where this was indicated as a measure on apps uploaded to Qlik Cloud.

If we only consider the App portion of Data Analyzed, what is the data analyzed size based on?

  • It is not the RAM footprint (I am told).
  • It could be the size of saved App?
  • It could be the peak reload size?
  • It could be the byte count of data flowing into the App during reload (excluding Data Moved)?

If it is peak reload size, that is a concern to me, because over the years I have seen apps using a lot of RAM during reload with a resulting RAM footprint that is significantly smaller (e.g. apps with significant aggregation, intervalmatch(), or a poor "drop table" strategy).
