alexmarinho
Hey guys, I have a performance problem in this scenario:
42 million rows in a table with just 2 dimensions. One of these dimensions is a text string of variable length (1 to 400 characters), and about 80% of its values are distinct.
I created a filter on this dimension, but it takes around 40 seconds to render the results when I search in this filter. I'd like something closer to 8 to 10 seconds.
Server: 64 GB RAM, 10 cores.
Any tips?
Alessandro
TimvB
I think you need On Demand App Generation (ODAG). You can read all about it here: https://help.qlik.com/en-US/sense/November2019/Subsystems/Hub/Content/Sense_Hub/DataSource/Manage-bi...
The Qlik Continuous Classroom has some tutorials with examples on ODAG. You can find them under:
Qlik Continuous Classroom > Learning Plans > Qlik Sense Data Architect > Advanced > Part 7: Working With Big Data > Introduction to On Demand App Generation.
Hope it helps!
alexmarinho
Thanks Tim.
But I don't think that's the case here: this field is the product name, and my business requires searching by product name across all products. Even if I use ODAG, I still need a filter on the item name, so all 42 million rows will be used.
I'm wondering whether the solution would be to improve the hardware specs to get a quicker response from the filter.
TimvB
Okay. Maybe you should analyze the Qlik Sense Operations Monitor app to figure out when your server reaches maximum capacity. Perhaps you can reschedule some reloads or drop unnecessary fields/tables from the data model to optimize performance. We installed a development server and a user server: the development server runs all the heavy ETL apps, and the user server runs only the user apps, which load the data via binary loads. This improved the performance of the apps significantly from the users' perspective.
It also helps to create some higher-level dimensions above the product name and use drill-down filtering. However, you then cannot directly start searching within the product name.
alexmarinho
Hello Tim. The Operations Monitor shows 75% of committed RAM after I load all the data.
LOAD ProductCode, ProductName FROM Transformed.qvd (qvd);
That's the only table in my model.
It loads very fast, just 2 minutes (optimized QVD load).
The problem is the filter component in the UI.
I was thinking about creating another dimension using the first letter of the product name, e.g. A, B, C. Selecting the letter A would let the user filter only product names starting with A. But my users won't like this.
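For what it's worth, that first-letter idea could be sketched as a derived field in the load script (a sketch only, reusing the LOAD statement quoted above; note that adding any expression turns the optimized QVD load into a standard one, so reload time will grow):

```
// Sketch: derive a one-character bucket from ProductName at load time.
// Upper() keeps 'a' and 'A' in the same bucket.
// Caveat: the added expression breaks the optimized QVD load,
// so the 2-minute reload will take longer.
LOAD
    ProductCode,
    ProductName,
    Upper(Left(ProductName, 1)) AS ProductFirstLetter
FROM Transformed.qvd (qvd);
```

ProductFirstLetter could then drive a small first-level filter pane in front of the ProductName pane.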
TimvB
Does the app contain complex expressions in the UI, such as Aggr(), nested If(), etc.? These expressions often cause slow responses. You could try loading the data model into an app that contains only a table with the dimension ProductName and a simple measure, then test the response time in this stripped-down app when filtering on ProductName. Perhaps you could move some expressions that are now calculated in the front end into the script to improve performance.
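As a rough sketch of moving a front-end calculation into the script (the flag name IsLongName and its condition are made up for illustration, following the same LOAD statement used earlier in the thread):

```
// Precompute a flag during the load instead of evaluating
// a nested If() in every chart; the UI measure then becomes
// a cheap Sum(IsLongName).
LOAD
    ProductCode,
    ProductName,
    If(Len(ProductName) > 100, 1, 0) AS IsLongName
FROM Transformed.qvd (qvd);
```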
alexmarinho
Hello Tim. Thanks for the attention. The UI has just one simple filter pane with only the product name dimension: no set analysis, no transformation expressions, only the dimension.
I tried limiting the maximum product name length to 100 characters, but the filter response for my search only dropped by 8 seconds (40 s to 32 s).
TimvB
Did you also check the CPU usage of the server?
I found the following post about performance improvement: https://community.qlik.com/t5/QlikView-App-Development/performance-big-app-huge-data-set-500-million...
