Performance depends on both the number of records and the number of fields. I would not recommend a fact table with 10 million rows and 300 fields. I've even seen a case where having 600 fields caused a strange error in QlikView.
The worst performance I've seen occurred when the model contained two tables, each with a large number of records and fields. For this reason I stopped using link tables long ago, but your question is a good one: I don't know which has more effect on performance, a large number of fields or a large number of records.
Looking forward to the results.
I managed to split the problem in half.
I generated the minimum required records and the minimum required fields.
When I went full blast (i.e. all records), the 30K source rows generated 16 million rows, with a size of 150 MB. Execution time: 11 minutes.
When I divided the problem, I kept 500K records with 23 additional fields. Size: 6 MB. Execution time: less than a minute.
Now for the next part of the riddle, which is how to automate the creation of the 23 fields.
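One possible way to automate that in the load script is a FOR..NEXT loop that joins each generated field onto the table in turn. This is only a sketch: the table name `Base`, the key field `Key`, and the `If(Category = $(i), ...)` expression are placeholders for whatever your 23 fields are actually derived from.

```
// Hypothetical sketch: table, key, and expression are placeholders.
// Adds Flag_1 .. Flag_23 to the resident table Base, one per loop pass.
FOR i = 1 TO 23
    LEFT JOIN (Base)
    LOAD
        Key,
        If(Category = $(i), 1, 0) AS Flag_$(i)
    RESIDENT Base;
NEXT i
```

If the expressions differ per field, you could drive the loop from a small inline table of field names and expressions instead of a plain counter.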