hakkthan
Contributor

Dynamically Filter Data Before Caching the App in Memory

Hello guys.

I have a major problem with the size of my data sources, and I need an expert to tell me whether the idea I describe below can be applied or not.

I have 2 tables with more than 15 million event rows each that need to be associated (or joined in the Data Load Editor). I have 3 columns that I need to be able to filter on in my visuals, I create one extra dimension I need, and I have 3 measures with RangeSum() to present on my sheet (visuals).

Here is my problem:

When I am in the Data Load Editor, I write the script to load my tables as follows:

connect to the OneDrive data,

load the tables,

store them as QVD files,

drop the initial tables,

load them again from the QVDs,

and when I try to run the script (press Load data), the system returns a memory limit error.
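Roughly, the script looks like this (the connection, file, and table names below are just placeholders for my real ones):

// Load one of the raw event tables from the OneDrive connection
Events1:
LOAD *
FROM [lib://OneDrive_Connection/Events1.xlsx]
(ooxml, embedded labels, table is Sheet1);

// Store it as a QVD and drop the in-memory copy
STORE Events1 INTO [lib://DataFiles/Events1.qvd] (qvd);
DROP TABLE Events1;

// Reload from the QVD (the second table is handled the same way, then joined)
Events1:
LOAD *
FROM [lib://DataFiles/Events1.qvd] (qvd);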

I know that I could increase our subscription's RAM limits, but we are not willing to do so because it is very costly.

So, I am wondering whether there is any way to create visual filters that tell the load script to filter my QVDs dynamically and only then load them into RAM, so that all the calculations are made and the results are presented on that reduced data set.

Could this be possible with Qlik Cloud Business (SaaS)?

Do you think this is the right thing to do?

1 Reply
Dalton_Ruer
Support

Oooooh interesting situation. 

Solution 1: You may simply have more data than your subscription will support. However, there are some things you can do. If you have people's names, consider breaking them into First and Last. If you have phone numbers, consider breaking them out. If you have full addresses, consider breaking them out into Number and Street, or into City, State, and Zip. Qlik only stores the unique values for each column. "Tom Smith" and "Tom Jones" yield 2 distinct values of the full name, but if they are broken out you have 1 first name, "Tom," and 2 last names, "Smith" and "Jones." Across the millions of rows you have, you likely have thousands of Toms, thousands of Smiths, and thousands of Joneses, and that will save you a bunch.
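Just to make that concrete, here is a rough sketch of what I mean in the load script (the field and file names are made-up examples, not anything from your model). SubField() splits one high-cardinality column into several low-cardinality ones, so Qlik's symbol tables store far fewer distinct values:

// Split combined fields so each column holds fewer unique values
Customers:
LOAD
    SubField(FullName, ' ', 1) AS FirstName,
    SubField(FullName, ' ', 2) AS LastName,
    SubField(Phone, '-', 1)    AS AreaCode,
    SubField(Phone, '-', 2)    AS Prefix,
    SubField(Phone, '-', 3)    AS LineNumber
FROM [lib://DataFiles/Customers.qvd] (qvd);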

Solution 2: Let's assume you really do have more data than you are licensed for. Your ask for loading on demand is definitely doable. The concept is simply to present the values needed for selections in a parent application, and then, when the user says "OK, now I want to see the details behind these selections," present that detail to them on the fly. The following article talks through the variety of ways you can do that. If you want a thoroughly built-out solution that pops up already filtered to their selections, that's called On-Demand App Generation. I discuss it briefly but go right into another, very similar approach called Dynamic Views. Dynamic Views are built from the same kind of "load" code, but charts from the template application are surfaced in your parent application.
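To give you a feel for the template-app side of that (this is only a sketch; the field names and QVD path are examples, and you should check the On-Demand App Generation help for the exact binding prefix your setup expects), the template's load script filters the QVD using a binding expression that gets replaced with the user's selections from the parent app:

// Template app: the $(odso_CustomerID) binding is replaced at generation
// time with the quoted, comma-separated values selected in the parent app.
// CustomerID and the QVD path are placeholders.
Events:
LOAD *
FROM [lib://DataFiles/Events1.qvd] (qvd)
WHERE Match(CustomerID, $(odso_CustomerID)) > 0;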

https://dataonthe.rocks/dynamic-views-in-qlik-sense-saas/

Solution 3: If your underlying data is stored in Snowflake or Databricks, our Direct Query approach will let you read from your source on the fly as filters are applied.