With this number of records and columns the export is bound to be slow, and you should count on hours rather than minutes. It's also quite possible that you hit some limitation within the process and the export breaks.
Instead of exporting to Excel you could export to CSV, which opens in Excel without any problem. Another possibility would be to store the data directly within the load process.
IMO it's the wrong way to provide the users with the data, and this is independent of the tool used (I doubt that any tool could calculate and deliver this amount of data from a data backend + frontend to Excel in a prompt manner).
Each tool needs to collect and combine all these dimensions, perform the measure calculations on them, render the results within a visualization, and then translate everything into a different data format and send it over the network to the target. With around 270k records and 93 columns that's roughly 25M cell values and at least 250 MB of data, but it will probably be more, likely around 1 GB. Many of the process steps will probably be single-threaded, so the whole task will take quite some time, and it's quite likely that it runs into a timeout and/or hits some limitation.
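The size estimate above can be checked with a quick back-of-the-envelope calculation; the 10 and 40 bytes-per-cell figures below are my own rough assumptions, not measured values:

```python
# Rough size estimate for the export described above.
rows, cols = 270_000, 93
cells = rows * cols                      # ~25.1 million cell values
print(f"{cells:,} cells")

# Assuming an average of ~10 bytes per serialized cell value
low_estimate_mb = cells * 10 / 1e6       # lower bound, ~250 MB
# xlsx stores each cell as XML, easily 40+ bytes per cell before compression
high_estimate_gb = cells * 40 / 1e9      # upper end, ~1 GB
print(f"{low_estimate_mb:.0f} MB to {high_estimate_gb:.1f} GB")
```

So even under friendly assumptions the payload lands in the hundreds-of-megabytes range before the tool has done any rendering or network transfer.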
Besides this, I don't believe that the users really need all these columns and records at the same time. If you could leave out 30 columns, split the export into 2-3 parts, and/or change the target from xls(x) to CSV, you might get it down to a size that works.
Nevertheless, I suggest you carefully consider whether the store approach mentioned before isn't the better option, especially since I doubt the users do any direct analysis with the data; more likely they feed it into some kind of further processing. It might take some effort to change the logic there to use the mentioned CSV as a source, but I think any other way will be more expensive in the end.