You are right, I have also experienced the same issue with my clients using SAP. Without many options left, we built the entire logic in QV, which delayed timelines and, to some extent, hurt performance since the data had to be fetched through a lot of staging steps.
I would also like to know what approaches are adopted by others facing this challenge.
In our case, we were able to sell the idea of using the SAP reports, including some Z ones.
Nevertheless, we still get some of the tables via the traditional connector anyway; this is probably because the logic to reconstruct those is not as heavy as for, say, the Accounts Receivable report, which is a Z report here since there is some intrinsic business logic to be considered.
However, it has been somewhat difficult to get used to the Report connector because of the 1000-character limit (which is imposed by SAP, AFAIK).
The problem I'm having right now is precisely with this last limitation: the Z report for Accounts Receivable has a great deal of denormalization and big chunks of text, and I haven't been able to get a reliable result through the wizard available in the SAP Connector, or through other QV manipulation (sure, I can grab the rows and group them via SubField() and some other tricks, but the Connector itself isn't returning the full field contents anyway; I should open a separate question about this...).
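To make the "grab the rows and group them" trick concrete, here is a minimal sketch of the idea in Python rather than QV script. It assumes (hypothetically) that the connector splits any field longer than 1000 characters across several rows, and that each chunk arrives with the record key plus a sequence number; the function and field names are invented for illustration.

```python
from collections import defaultdict

def reassemble(rows):
    """rows: list of (record_key, chunk_seq, text_chunk) tuples,
    as a split long field might arrive from the connector."""
    chunks = defaultdict(list)
    for key, seq, text in rows:
        chunks[key].append((seq, text))
    # Sort each record's chunks by sequence number and concatenate them
    return {key: "".join(t for _, t in sorted(parts))
            for key, parts in chunks.items()}

split_rows = [
    ("DOC1", 2, "world"),
    ("DOC1", 1, "hello "),
    ("DOC2", 1, "short"),
]
print(reassemble(split_rows))
# → {'DOC1': 'hello world', 'DOC2': 'short'}
```

In QV script the equivalent would be a GROUP BY on the record key with an ordered concatenation, but note the caveat above: if the connector itself truncates the field, no amount of regrouping downstream can recover the lost characters.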
We pull around 16 tables from SAP ECC (KONM, KONP, KONH, the A tables, etc.) and do all the logic in QV that SAP does for its reports.
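For readers unfamiliar with those tables: SAP stores pricing conditions across access (A*) tables, condition headers (KONH), and condition items (KONP), linked by the condition record number KNUMH, so replicating a report means re-joining them yourself. A rough Python sketch of that join (real tables carry many more fields, and the sample values here are made up):

```python
# Simplified A-table rows: material number -> condition record number
a_table = [
    {"MATNR": "M-01", "KNUMH": "0000001"},
    {"MATNR": "M-02", "KNUMH": "0000002"},
]
# Simplified KONP rows: condition record -> rate and currency
konp = [
    {"KNUMH": "0000001", "KBETR": 99.50, "KONWA": "USD"},
    {"KNUMH": "0000002", "KBETR": 42.00, "KONWA": "USD"},
]

# Index the condition items by KNUMH, then resolve each A-table row
rate_by_knumh = {row["KNUMH"]: row for row in konp}
prices = {a["MATNR"]: rate_by_knumh[a["KNUMH"]]["KBETR"] for a in a_table}
print(prices)
# → {'M-01': 99.5, 'M-02': 42.0}
```

In QV the same thing is typically done with a LEFT JOIN or ApplyMap over the loaded tables; the point is just that the join logic SAP's report does internally has to be rebuilt explicitly in the script.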
It's definitely doable, but it depends on a lot of factors. If all the data you are pulling comes from straightforward tables with different fields in them, then I would think it could be done in QV as well.
But again, it depends on what the requirement is; QV scripting is definitely capable of doing a lot of things quite easily.
It all comes down to how experienced and knowledgeable you are in QV. As always, creativity is the key.