
    Why does GetSelectedCount() change the display...?

    Rene Dimarco

      Hi everyone,

       

      I have a model in Qlik Sense that shows the sales behavior of a specific category (Events) over a certain period of time. We currently have sales records for three years: 2015-2017.

       

      The objective was to show the 2017 results by default, as long as no year was selected in the "SalesYear" filter (AñoVenta in Spanish).

       

      That's why I decided to use the function GetSelectedCount(), which returns the number of active selections in a field. If this number is zero, there are no active selections, which should trigger the default behavior of showing just 2017.
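
      As a minimal sketch of the pattern I had in mind (the two text results are only placeholders to show which branch would apply):

      if(GetSelectedCount([AñoVenta]) = 0,
          'no year selected -> apply the 2017 default',
          'one or more years selected -> respect the selection')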

       

      The expression I used was:

       

      if(GetSelectedCount([AñoVenta]) = 0,
          Sum({1<[Categoría_Ingreso]={"Eventos"}>
               * 1<[AñoVenta]={"2017"}>} ValorBaseFacturado) / 1000000,
          Sum({$<[Categoría_Ingreso]={"Eventos"}>} ValorBaseFacturado) / 1000000)
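
      For clarity, this is how I read the two branches (the // comments are just my annotations, not part of the expression):

      // No year selected: start from the full data set (set identifier 1), keep only Eventos and force 2017
      Sum({1<[Categoría_Ingreso]={"Eventos"}> * 1<[AñoVenta]={"2017"}>} ValorBaseFacturado) / 1000000

      // One or more years selected: respect the current selections ($) and keep only Eventos
      Sum({$<[Categoría_Ingreso]={"Eventos"}>} ValorBaseFacturado) / 1000000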

       

      On the chart, however, all years are shown on the dimension axis, but only the months of 2017 show any values; the other years appear with values of 0:

       

      [Image: bad.png]

       

      In contrast, this is the original expression:

       

      Sum({$<[Categoría_Ingreso]={"Eventos"}>

           *1<[AñoVenta]={"2017"}>}

      ValorBaseFacturado)/1000000

       

      That expression produces this chart, which is the one I want:

       

      [Image: good.png]

      I would like to know why including that if() changes the chart so much, and whether there is a way to avoid it...

       

      This is how the screen looks with the filters:

       

      [Image: Captura de pantalla 2017-05-31 11.09.21.png]

       

      Thanks a lot for any help you guys can provide...