rmadursk
Contributor III

Count issue with duplicate fields in 2 different tables

I have a set of data that looks a bit like this in one table:

 

Titles                   | Count
Some Pizza               | 139
Magic Some Pizza         | 122
Calzone Flunky           | 64
Magic Calzone Flunky     | 47
Magic Some Administrator | 39
Associate Some Pizza     | 19
Some Pizza               | 16
Master Flunky            | 13
Magic Some Pizza         | 11
Magic Some Administrator | 10
Magic Calzone Flunky     | 9

Titles                   | Count | Other Titles             | Count
Some Pizza               | 123   | -                        | 0
Magic Some Pizza         | 111   | -                        | 0
Calzone Flunky           | 64    | -                        | 0
Magic Calzone Flunky     | 38    | -                        | 0
Magic Some Administrator | 29    | -                        | 0
Associate Some Pizza     | 19    | -                        | 0
Some Pizza               | 16    | Some Pizza               | 16
Master Flunky            | 13    | -                        | 0
Magic Some Pizza         | 11    | Magic Some Pizza         | 11
Magic Some Administrator | 10    | Magic Some Administrator | 10
Magic Calzone Flunky     | 9     | Magic Calzone Flunky     | 10

 

The second set of data is what I see after I add the second table, which has the same fields but not all of the members. I want to calculate the percentage of each group in the second table relative to the whole population in the first table, but the table (and other charts) splits each title into two rows. For example, Some Pizza has 139 members, 16 of which are in the second table (~10%), yet I keep getting 100% because of the way the rows are split out (123 and 16). I've tried a few expressions, including set analysis, but I can't seem to get them to use all of the members in my calculations.
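Roughly, the calculation I'm after per title is something like this (the field names here are made up for illustration; it assumes one row per member and some flag that marks the members that are also in the second table):

// per title: members in the second table divided by all members of that title
Count({<InSecondTable = {1}>} MemberID) / Count(MemberID)

The part I can't get right is the denominator seeing all of the members instead of just the split-out portion.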

3 Replies
Bhushan_Mahajan
Creator

@rmadursk You can use a set expression to divide the data, using flags created while loading. However, I'd ask you to share some sample data so I can understand the model.
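For example, something along these lines in the load script (table, field, and file names are placeholders; it assumes one row per member in each source):

Population:
LOAD
    MemberID,
    Titles
FROM [lib://Data/Population.qvd] (qvd);

// Left-join a flag onto the population for members that also appear
// in the second table; members not in the second table keep a null flag
Left Join (Population)
LOAD
    MemberID,
    1 AS InSecondTable
FROM [lib://Data/Subset.qvd] (qvd);

Then, with Titles as the chart dimension:

Count({<InSecondTable = {1}>} MemberID) / Count(MemberID)

Rows without the flag are simply excluded from the numerator, so the ratio is the subset members over all members of each title.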

nikhilraorane
Partner - Contributor II

Flag your data and then use set analysis, or you can join the two tables and then derive your third column in the script.
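If the join route suits the model better, a rough load-script sketch (all names are placeholders) could aggregate both tables per title and derive the extra columns in the script:

TitleCounts:
LOAD
    Titles,
    Count(MemberID) AS TotalCount
FROM [lib://Data/Population.qvd] (qvd)
GROUP BY Titles;

// Join on the per-title count from the second table
Left Join (TitleCounts)
LOAD
    Titles,
    Count(MemberID) AS SubsetCount
FROM [lib://Data/Subset.qvd] (qvd)
GROUP BY Titles;

// Turn missing subset counts into 0 and derive the percentage column
Result:
LOAD
    Titles,
    TotalCount,
    Alt(SubsetCount, 0) AS MembersInSubset,
    Alt(SubsetCount, 0) / TotalCount AS PctInSubset
RESIDENT TitleCounts;

DROP TABLE TitleCounts;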

rmadursk
Contributor III
Author

I played around with the flag approach a little bit today. No success, but I really didn't try very hard. I'm going to work up a sample data set to demonstrate my problem later, because the current data set is chock full of NPPI and very large. I'm pretty sure I can put together a useful example given a bit of time.
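In the meantime, a cut-down inline version (rows completely invented) is roughly the shape of what I have: a full population table plus a second table that reuses the same field names but only holds some of the members:

Population:
LOAD * INLINE [
MemberID, Titles
1, Some Pizza
2, Some Pizza
3, Some Pizza
4, Magic Some Pizza
5, Calzone Flunky
];

SecondTable:
LOAD * INLINE [
MemberID, Titles
3, Some Pizza
5, Calzone Flunky
];

One thing to watch when I build the real sample: if the field lists end up completely identical, Qlik will auto-concatenate the two loads into a single table, so I'll need at least one field that exists in only one of them (or a NoConcatenate prefix) to keep them separate.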