QlikView App Development

Discussion Board for collaboration related to QlikView App Development.

Partner

Re: Daily Sales Outstanding (DSO) calculation as of date

Attached is the updated Excel with the expected DSO as of date and the calculation for the same in the Remarks column.

The solution can be purely back-end, purely front-end, or a mix of both.

Really appreciate your support and time.

Best Regards

Rajat Arora

Partner

Re: Daily Sales Outstanding (DSO) calculation as of date

Hi Sunny,

Any luck on this?

I have tried many approaches, but I always get stuck with the data piling up.

Have you found any solution to this?

Best Regards

Rajat Arora

MVP

Re: Daily Sales Outstanding (DSO) calculation as of date

Sorry about that, will check today

MVP

Re: Daily Sales Outstanding (DSO) calculation as of date

How did you come up with 12.4 for Customer B?

I tried a bunch of calculations, but couldn't get close to it... from what I understood, it should be 8000/((6000+4000)/14), which gives me 11.2... but I might have missed the logic somewhere.

Partner

Re: Daily Sales Outstanding (DSO) calculation as of date

Hi Sunny,

For 1 Jan, 2018 and for Customer B, it will be 24.8.

The calculation goes like this:

8000/((6000+4000)/31) = 24.8

Although sales do not occur on every day of December, to get the daily average, the monthly sales are divided by the full 31 days in the month.
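In general-purpose terms, the formula described above (receivables divided by average daily sales over the full calendar month) could be sketched as follows. This is only an illustration of the calculation in this thread, not QlikView script; the function name is made up, and the figures are the Customer B example:

```python
from calendar import monthrange

def dso_as_of(receivables, monthly_sales, year, month):
    """Days Sales Outstanding: receivables divided by average
    daily sales, using all calendar days in the month."""
    days_in_month = monthrange(year, month)[1]
    avg_daily_sales = monthly_sales / days_in_month
    return receivables / avg_daily_sales

# Customer B as of 1 Jan 2018, looking back at Dec 2017:
# 8000 / ((6000 + 4000) / 31) = 24.8
print(round(dso_as_of(8000, 6000 + 4000, 2017, 12), 1))
```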

MVP

Re: Daily Sales Outstanding (DSO) calculation as of date

This is much more challenging than I imagined... give me some time and I will get back to you...

Partner

Re: Daily Sales Outstanding (DSO) calculation as of date

Your efforts are very much appreciated, Sunny.

Just to update: I have arrived at a solution which does the whole calculation in the script and gives the expected output.

But that logic requires a cross join, which heavily increases the data volume.

Although the next two steps cut the data down with aggregation and WHERE conditions, it consumes a lot of CPU for that many records.

It would be great if you could also suggest a solution that might reduce the load and data volume.
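The thread does not include the actual script, but one common way to avoid a full cross join is to aggregate sales per customer and month before joining to the receivables side, so the joined table grows as customers × months rather than every transaction against every date. A minimal Python sketch under that assumption (the sample data is hypothetical, reusing the Customer B figures from this thread):

```python
from collections import defaultdict

# Hypothetical transaction facts: (customer, month, sales_amount)
transactions = [
    ("B", "2017-12", 6000),
    ("B", "2017-12", 4000),
]
# Hypothetical receivables per (customer, month) and calendar days per month
receivables = {("B", "2017-12"): 8000}
days_in_month = {"2017-12": 31}

# Aggregate sales per (customer, month) FIRST, so the later join is
# one row per customer-month instead of a transaction-level cross join.
monthly_sales = defaultdict(float)
for customer, month, amount in transactions:
    monthly_sales[(customer, month)] += amount

# DSO per customer-month: receivables / (monthly sales / days in month)
dso = {}
for (customer, month), recv in receivables.items():
    avg_daily = monthly_sales[(customer, month)] / days_in_month[month]
    dso[(customer, month)] = recv / avg_daily

print(round(dso[("B", "2017-12")], 1))  # 24.8
```

The same idea in QlikView script would be a GROUP BY load on the sales table before joining it to the receivables table, keeping the intermediate table small.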