I'm new to QlikView and I'm trying to generate a variance chart of the difference in object weights between the first and second time an object is weighed.
I have a table containing the fields object_id, weight_index and weight. There are objects with only a first weighing (weight_index = 1), objects with both weight_index 1 and 2, and a few objects with only a second weighing (weight_index = 2).
Now I want a chart that shows, for all objects with two weighings, how many objects there are per difference between the first and second weighing.
For example:
object_id | weight_index | weight |
1 | 1 | 3.09 |
2 | 1 | 3.05 |
2 | 2 | 3.03 |
3 | 2 | 3.01 |
4 | 1 | 3.02 |
4 | 2 | 3.07 |
5 | 1 | 3.03 |
5 | 2 | 3.05 |
6 | 1 | 3.05 |
I managed to have a pivot-chart showing
object_id | weight 1 | weight 2 | difference |
2 | 3.05 | 3.03 | -0.02 |
4 | 3.02 | 3.07 | 0.05 |
5 | 3.03 | 3.05 | 0.02 |
using object_id as dimension and
weight 1 = sum({<weight_index = {1}, object_id = P({$<weight_index = {2}>}) - E({$<weight_index = {1}>})>} weight)
weight 2 = sum({<weight_index = {2}, object_id = P({$<weight_index = {2}>}) - E({$<weight_index = {1}>})>} weight)
difference = [weight 2] - [weight 1]
as expressions
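For readers unfamiliar with the P()/E() element functions used above: P({$<weight_index = {2}>}) returns the object_ids possible when the second weighing is selected, and E({$<weight_index = {1}>}) returns the object_ids excluded when the first weighing is selected, so P minus E keeps only objects that have both weighings. A minimal Python sketch of that selection (an analogy, not QlikView code) on the example data:

```python
# Model the example table as (object_id, weight_index, weight) tuples.
rows = [
    (1, 1, 3.09),
    (2, 1, 3.05), (2, 2, 3.03),
    (3, 2, 3.01),
    (4, 1, 3.02), (4, 2, 3.07),
    (5, 1, 3.03), (5, 2, 3.05),
    (6, 1, 3.05),
]

all_ids = {oid for oid, _, _ in rows}

# P({$<weight_index = {2}>}): ids possible when weight_index = 2 is selected
possible_when_2 = {oid for oid, idx, _ in rows if idx == 2}

# E({$<weight_index = {1}>}): ids excluded when weight_index = 1 is selected
excluded_when_1 = all_ids - {oid for oid, idx, _ in rows if idx == 1}

# P - E: objects with a second weighing that also have a first one
selected = possible_when_2 - excluded_when_1
print(sorted(selected))  # [2, 4, 5]
```

This matches the pivot chart above, which lists exactly objects 2, 4 and 5.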
But now I want a chart which shows
difference | amount of objects |
-0.02 | 1 |
0.05 | 1 |
0.02 | 1 |
(Of course, in this example the amount is always 1, but that won't be the case with my real data.)
How can I do this best?
I have been experimenting with calculated dimensions as such:
difference = aggr({$<object_id = P({$<weight_index = {2}>}) - E({$<weight_index = {1}>})>} sum({<weight_index = {2}>} weight) - sum({<weight_index = {1}>} weight), object_id)
and as expression:
amount of objects = count(object_id)
But that gives strange results.
A script solution (this assumes the table is loaded ordered by object_id and weight_index, so previous(weight) holds the first weighing):
,if(object_id = previous(object_id), weight - previous(weight)) as difference
dimension = difference (suppress when value is null)
expression = count(distinct object_id)
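The previous()-based logic can be sketched outside QlikView as well. A minimal Python analogy (hypothetical data, assuming the rows are sorted by object_id and weight_index before the pass):

```python
# Rows as (object_id, weight_index, weight); only objects with two
# weighings are included here for brevity.
rows = [
    (2, 1, 3.05), (2, 2, 3.03),
    (4, 1, 3.02), (4, 2, 3.07),
    (5, 1, 3.03), (5, 2, 3.05),
]
rows.sort(key=lambda r: (r[0], r[1]))  # ORDER BY object_id, weight_index

differences = {}
prev = None
for oid, idx, weight in rows:
    if prev is not None and prev[0] == oid:
        # Same object as the previous row: this row is the second weighing.
        # Round to the source precision to avoid float noise.
        differences[oid] = round(weight - prev[2], 2)
    prev = (oid, idx, weight)

print(differences)  # {2: -0.02, 4: 0.05, 5: 0.02}
```

Counting objects per difference value then gives the desired frequency chart.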
A chart solution (looks like you were close):
dimension = aggr(if(count(weight_index)=2,sum({<weight_index={2}>} weight)-sum({<weight_index={1}>} weight)),object_id)
expression = count(distinct object_id)
See attached.
Thanks for your reply. However, I get exactly the same results with your calculated dimension as with mine, except that with your dimension there is the risk that an object_id with two weighings at weight_index = 1 is also included (this happens occasionally when an object is accidentally weighed twice), hence my explicit check on weight_index = 1 and weight_index = 2.
The result I currently have (with both your expression and mine) looks like this (there are two types of objects, open and closed, which I didn't mention before; however, the id is unique, so that shouldn't matter):
Variance
Difference | # open objects | # closed objects
---|---|---
Total | 104 | 197
-0,05 | 0 | 1 |
-0,04 | 3 | 5 |
-0,03 | 1 | 8 |
-0,03 | 2 | 3 |
-0,02 | 16 | 36 |
-0,01 | 19 | 20 |
-0,0099999999999998 | 16 | 33 |
0 | 21 | 45 |
0,0099999999999998 | 2 | 17 |
0,01 | 8 | 12 |
0,02 | 1 | 0 |
0,02 | 3 | 10 |
0,03 | 4 | 0 |
0,03 | 2 | 2 |
0,04 | 2 | 0 |
0,04 | 2 | 1 |
0,05 | 1 | 0 |
0,06 | 0 | 2 |
0,07 | 0 | 1 |
0,09 | 0 | 1 |
0,11 | 1 | 0 |
As you can see, I get -0.03 twice, 0.02 twice, 0.03 twice, 0.04 twice, and even both a -0.0099999999999998 and a 0.0099999999999998, while the maximum precision of the source (csv) file is 0.010 (formatted as 0.000 but accurate to 0.01).
In another chart which shows the difference between the 2 weightings I also see these rounding errors:
Bin weight per weighing
Object ID | object_type | weight 1 | weight 2 | difference |
---|---|---|---|---|
... | ... | ... | ... | ... |
105156 | Gesloten | 3,22 | 3,21 | -0,01 |
153581 | Gesloten | 3,16 | 3,15 | -0,01 |
162698 | Gesloten | 3,16 | 3,15 | -0,01 |
105664 | Gesloten | 3,22 | 3,21 | -0,01 |
159974 | Open | 3,04 | 3,03 | -0,01 |
159970 | Open | 3,02 | 3,01 | -0,01 |
122671 | Gesloten | 3,07 | 3,06 | -0,0099999999999998 |
122414 | Gesloten | 3,05 | 3,04 | -0,0099999999999998 |
123925 | Gesloten | 3,03 | 3,02 | -0,0099999999999998 |
120697 | Gesloten | 3,05 | 3,04 | -0,0099999999999998 |
126694 | Gesloten | 3,09 | 3,08 | -0,0099999999999998 |
126278 | Gesloten | 3,07 | 3,06 | -0,0099999999999998 |
124399 | Gesloten | 3,07 | 3,06 | -0,0099999999999998 |
112944 | Open | 3,05 | 3,04 | -0,0099999999999998 |
127231 | Gesloten | 3,07 | 3,06 | -0,0099999999999998 |
131458 | Gesloten | 3,09 | 3,08 | -0,0099999999999998 |
113134 | Open | 3,03 | 3,02 | -0,0099999999999998 |
107337 | Open | 3,01 | 3 | -0,0099999999999998 |
115471 | Open | 3,03 | 3,02 | -0,0099999999999998 |
114649 | Open | 3,01 | 3 | -0,0099999999999998 |
114399 | Open | 3,03 | 3,02 | -0,0099999999999998 |
115148 | Open | 3,05 | 3,04 | -0,0099999999999998 |
116171 | Open | 3,03 | 3,02 | -0,0099999999999998 |
116086 | Open | 3,05 | 3,04 | -0,0099999999999998 |
102434 | Open | 3,05 | 3,04 | -0,0099999999999998 |
102657 | Open | 3,05 | 3,04 | -0,0099999999999998 |
102728 | Open | 3,05 | 3,04 | -0,0099999999999998 |
103221 | Open | 3,05 | 3,04 | -0,0099999999999998 |
155772 | Gesloten | 3,15 | 3,14 | -0,0099999999999998 |
154394 | Gesloten | 3,11 | 3,1 | -0,0099999999999998 |
153644 | Gesloten | 3,13 | 3,12 | -0,0099999999999998 |
152958 | Gesloten | 3,13 | 3,12 | -0,0099999999999998 |
150063 | Gesloten | 3,13 | 3,12 | -0,0099999999999998 |
155841 | Gesloten | 3,11 | 3,1 | -0,0099999999999998 |
150173 | Gesloten | 3,13 | 3,12 | -0,0099999999999998 |
155381 | Gesloten | 3,13 | 3,12 | -0,0099999999999998 |
150913 | Gesloten | 3,15 | 3,14 | -0,0099999999999998 |
103526 | Open | 3,05 | 3,04 | -0,0099999999999998 |
103998 | Open | 3,03 | 3,02 | -0,0099999999999998 |
159150 | Open | 3,05 | 3,04 | -0,0099999999999998 |
161172 | Gesloten | 3,17 | 3,16 | -0,0099999999999998 |
158605 | Gesloten | 3,19 | 3,18 | -0,0099999999999998 |
110337 | Gesloten | 3,15 | 3,14 | -0,0099999999999998 |
109680 | Gesloten | 3,15 | 3,14 | -0,0099999999999998 |
111909 | Gesloten | 3,19 | 3,18 | -0,0099999999999998 |
108670 | Gesloten | 3,21 | 3,2 | -0,0099999999999998 |
158077 | Gesloten | 3,21 | 3,2 | -0,0099999999999998 |
158703 | Gesloten | 3,21 | 3,2 | -0,0099999999999998 |
110176 | Gesloten | 3,17 | 3,16 | -0,0099999999999998 |
158713 | Gesloten | 3,21 | 3,2 | -0,0099999999999998 |
108167 | Gesloten | 3,21 | 3,2 | -0,0099999999999998 |
105928 | Gesloten | 3,21 | 3,2 | -0,0099999999999998 |
105305 | Gesloten | 3,21 | 3,2 | -0,0099999999999998 |
105583 | Gesloten | 3,21 | 3,2 | -0,0099999999999998 |
158827 | Gesloten | 3,21 | 3,2 | -0,0099999999999998 |
100457 | Open | 3,02 | 3,02 | 0 |
122297 | Gesloten | 3,08 | 3,08 | 0 |
120131 | Gesloten | 3,05 | 3,05 | 0 |
120216 | Gesloten | 3,04 | 3,04 | 0 |
121615 | Gesloten | 3,07 | 3,07 | 0 |
121685 | Gesloten | 3,04 | 3,04 | 0 |
121694 | Gesloten | 3,04 | 3,04 | 0 |
126926 | Gesloten | 3,06 | 3,06 | 0 |
125325 | Gesloten | 3,03 | 3,03 | 0 |
125090 | Gesloten | 3,07 | 3,07 | 0 |
124568 | Gesloten | 3,06 | 3,06 | 0 |
120335 | Gesloten | 3,06 | 3,06 | 0 |
125334 | Gesloten | 3,04 | 3,04 | 0 |
132145 | Gesloten | 3,07 | 3,07 | 0 |
133792 | Gesloten | 3,06 | 3,06 | 0 |
130493 | Gesloten | 3,07 | 3,07 | 0 |
109832 | Gesloten | 3,15 | 3,15 | 0 |
161699 | Gesloten | 3,18 | 3,18 | 0 |
159978 | Open | 3,03 | 3,03 | 0 |
123847 | Gesloten | 3,02 | 3,03 | 0,0099999999999998 |
122115 | Gesloten | 3 | 3,01 | 0,0099999999999998 |
124666 | Gesloten | 3,06 | 3,07 | 0,0099999999999998 |
126493 | Gesloten | 3,06 | 3,07 | 0,0099999999999998 |
126727 | Gesloten | 3,06 | 3,07 | 0,0099999999999998 |
122795 | Gesloten | 3,06 | 3,07 | 0,0099999999999998 |
123473 | Gesloten | 3,04 | 3,05 | 0,0099999999999998 |
130472 | Gesloten | 3,06 | 3,07 | 0,0099999999999998 |
131658 | Gesloten | 3,06 | 3,07 | 0,0099999999999998 |
130629 | Gesloten | 3,06 | 3,07 | 0,0099999999999998 |
127847 | Gesloten | 3,06 | 3,07 | 0,0099999999999998 |
128986 | Gesloten | 3,08 | 3,09 | 0,0099999999999998 |
128769 | Gesloten | 3,06 | 3,07 | 0,0099999999999998 |
113851 | Open | 3 | 3,01 | 0,0099999999999998 |
114665 | Open | 3,04 | 3,05 | 0,0099999999999998 |
122695 | Gesloten | 3,08 | 3,09 | 0,0099999999999998 |
153752 | Gesloten | 3,12 | 3,13 | 0,0099999999999998 |
151202 | Gesloten | 3,12 | 3,13 | 0,0099999999999998 |
108742 | Gesloten | 3,2 | 3,21 | 0,0099999999999998 |
130612 | Gesloten | 3,07 | 3,08 | 0,01 |
133624 | Gesloten | 3,05 | 3,06 | 0,01 |
127010 | Gesloten | 3,05 | 3,06 | 0,01 |
132636 | Gesloten | 3,05 | 3,06 | 0,01 |
132761 | Gesloten | 3,05 | 3,06 | 0,01 |
131156 | Gesloten | 3,07 | 3,08 | 0,01 |
135058 | Gesloten | 3,07 | 3,08 | 0,01 |
129510 | Gesloten | 3,07 | 3,08 | 0,01 |
135048 | Gesloten | 3,05 | 3,06 | 0,01 |
... | ... | ... | ... | ... |
The dimensions here are object_id and object_type
The expressions:
weight 1 = sum({<weight_index= {1}, object_id = P({$<weight_index = {2}>}) - E({$<weight_index = {1}>})>} weight)
weight 2 = sum({<weight_index= {2}, object_id = P({$<weight_index = {2}>}) - E({$<weight_index = {1}>})>} weight)
difference = [weight 2] - [weight 1]
The number format is set to "expression default"
and the accuracy of the input-file is 0.010
So I assume the weird results I get are caused by rounding errors that aren't even displayed (like those two -0.03 values). But I don't understand where and why QlikView makes those errors in plain additions and subtractions.
Here you can see a snippet of the import file:
122297|12.01.2010|1|3,080|
114665|12.01.2010|0|3,050|
155225|12.01.2010|1|3,150|
159155|12.01.2010|0|3,040|
122795|12.01.2010|1|3,070|
135402|12.01.2010|1|3,050|
122414|12.01.2010|1|3,040|
155432|12.01.2010|1|3,110|
102553|12.01.2010|1|3,170|
124873|12.01.2010|1|3,030|
120697|12.01.2010|1|3,040|
131458|12.01.2010|1|3,080|
125334|12.01.2010|1|3,040|
105664|12.01.2010|1|3,210|
154990|12.01.2010|1|3,140|
122695|12.01.2010|1|3,090|
121685|12.01.2010|1|3,040|
121615|12.01.2010|1|3,070|
And this is the script I use to import the file:
object_data:
LOAD @1 as object_id, // bin ID
MakeDate(Mid(@2, 7, 4), Mid(@2, 4, 2), Mid(@2, 1, 2)) as object_scandate,
@3 as type_id,
@4 as weight,
1 as weight_index
FROM
D:\object_data_first_weight.txt
(txt, codepage is 1252, no labels, delimiter is '|', no quotes);
LOAD @1 as object_id, // bin ID
MakeDate(Mid(@2, 7, 4), Mid(@2, 4, 2), Mid(@2, 1, 2)) as object_scandate,
@3 as type_id,
@4 as weight,
2 as weight_index
FROM
D:\object_data_second_weight.txt
(txt, codepage is 1252, no labels, delimiter is '|', no quotes);
Hi,
Can't you use the Number tab in the chart properties and round your expression off to the number of decimals you want? You will get the same result this way too.
Regards,
Syed.
Hi,
I mean: set the number format to Fixed, 3 decimals.
I can set the number format to Fixed, which I did before, and that way the chart containing both weights per object_id indeed looks better. In the calculated dimension I can't use the number formatting, but there I can use the num() function, and visually it does format the numbers. However, the numbers are still listed multiple times: internally they are not actually the same, only displayed the same:
Variance
Difference | # open objects | # closed objects
---|---|---
Total | 104 | 197
-0,050 | 0 | 1 |
-0,040 | 3 | 5 |
-0,030 | 1 | 8 |
-0,030 | 2 | 3 |
-0,020 | 16 | 36 |
-0,010 | 19 | 20 |
-0,010 | 16 | 33 |
0,000 | 21 | 45 |
0,010 | 2 | 17 |
0,010 | 8 | 12 |
0,020 | 1 | 0 |
0,020 | 3 | 10 |
0,030 | 4 | 0 |
0,030 | 2 | 2 |
0,040 | 2 | 0 |
0,040 | 2 | 1 |
0,050 | 1 | 0 |
0,060 | 0 | 2 |
0,070 | 0 | 1 |
0,090 | 0 | 1 |
0,110 | 1 | 0 |
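The display-versus-internal-value distinction behind this table can be reproduced outside QlikView. A minimal Python sketch (an analogy illustrating the same IEEE 754 behaviour, not QlikView itself), using weight pairs taken from the data above:

```python
# Two differences that both mean -0.01 logically, computed from
# different weight pairs, end up as different floats internally.
x = 3.06 - 3.07   # from pair (3.07, 3.06)
y = 3.21 - 3.22   # from pair (3.22, 3.21)

# Fixed 3-decimal formatting (like num(..., '0.000')) hides the difference:
print(f"{x:.3f}", f"{y:.3f}")  # -0.010 -0.010

# ...but as dimension values they are still distinct:
print(x == y)  # False
```

This is why the formatted chart shows the same difference on two separate rows: the dimension groups by the internal value, not the displayed text.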
I still don't get why QlikView gives me 3.07 - 3.06 = -0.0099999999999998 while it should give -0.01. Surely there can't be a rounding error in such a simple subtraction? Or is there?
In the source data file those numbers are 3,070 and 3,060, and 3.070 - 3.060 should still give me -0.010.
And I'm sure this is the cause of my problem, the duplicate 'difference' values in my chart above: QlikView makes some rounding errors but doesn't display them because they are too small (or, in the current case, formatted fixed to 3 decimals), so the actual internal values differ for QlikView and are treated as distinct dimension values, visually producing duplicate rows.
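There can indeed be a rounding error in such a simple subtraction: this is standard IEEE 754 double-precision behaviour, not something specific to QlikView. Neither 3.07 nor 3.06 has an exact binary representation, so their difference is only approximately 0.01. A minimal Python sketch showing the same effect:

```python
# Neither operand is stored exactly, so the difference is off by ~2e-16.
a, b = 3.07, 3.06
diff = a - b

print(diff)            # prints 0.009999999999999787, not 0.01
print(diff == 0.01)    # False
print(round(diff, 2) == 0.01)  # True: rounding restores the expected value
```

Any language using binary floating point (C, Java, Python, ...) shows the same result; the fix is always to round back to the known precision of the data.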
I found the solution: I have to use the round function before I aggregate.
This makes my calculated dimension like this:
=aggr(round(sum({$<object_id = P({$<weight_index = {2}>}) - E({$<weight_index = {1}>}), weight_index= {2}>} weight) - sum({<object_id = P({$<weight_index= {2}>}) - E({$<weight_index= {1}>}), weight_index= {1}>} weight), 0.01), object_id)
I still think this is a shortcoming of QlikView: it internally works with floating-point numbers (which don't always represent a value exactly, but use the closest representable value), yet there is nowhere to define the numeric format of the source file, nor any built-in correction (like automatically applying the round function) in calculations to prevent this kind of unexpected behaviour.
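A minimal Python sketch (an analogy, not QlikView code) of why the round must happen before the aggregation: without it, float noise splits one logical difference bucket into several distinct dimension values. The weight pairs are taken from the chart above; all four logically mean -0.01.

```python
from collections import Counter

# (first weighing, second weighing) pairs; every difference is logically -0.01
pairs = [(3.22, 3.21), (3.16, 3.15), (3.07, 3.06), (3.05, 3.04)]

# Grouping on the raw float difference: the bucket splits in two
raw = Counter(w2 - w1 for w1, w2 in pairs)

# Rounding BEFORE grouping (like round(..., 0.01) inside aggr): one bucket
rounded = Counter(round(w2 - w1, 2) for w1, w2 in pairs)

print(len(raw))   # 2  — two distinct floats near -0.01
print(rounded)    # Counter({-0.01: 4})
```

Rounding only in the display (after grouping) would still show two rows, which is exactly the duplicate-value symptom described earlier in the thread.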