I am encountering an issue I have never had before and was wondering if anyone else has run into something similar.
I recently went through one of my most used data models and did some cleanup work: removing unused fields and combining some tables to tidy the schema a bit. The model loads great, and the .QVW file the model is built in runs and operates smoothly. I have done this plenty of times before, so I was surprised that when I binary loaded it into an app I had already developed, performance ground to a halt.
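For context, the binary load in my shell app is just the standard first-statement form (the path and file name below are placeholders, not my actual model):

```
// BINARY must be the very first statement in the load script.
// "MyDataModel.qvw" stands in for my actual model file.
Binary [..\Models\MyDataModel.qvw];
```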
Has this happened to anyone else? The app I had already built is just a shell and can normally handle this type of activity easily; I've done it fairly often. It is just a template of our most common dashboard styles, with some default variables already in the file and a few calculations on the first tab to confirm the model loaded correctly. What happens now is that when it loads, it is extremely bogged down, even though there are no more than 5 calculations in the file. If I click too many times or attempt to add too many sheet objects too quickly, the app freezes and I have to force it shut down or wait upwards of 10 minutes for it to finish processing.
Can a binary load cause this type of issue? I don't think it can be the PC / server I'm developing on, as every other app loads correctly and runs smoothly, and I have apps much larger in file size that don't cause this much of a performance grind.
My only thought is this: if a pre-made variable in my shell file can't calculate because the field it references is gone, would that cause a slowdown? I had assumed it would behave like a chart referencing missing data and simply display a calculation error. I don't even truly think this is the issue, because the model binary loaded into a brand-new .QVW is still slow, yet if I develop directly in the model's .QVW file it is fine.
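To illustrate what I mean (the variable and field names here are made up, not my actual ones), one of the shell's pre-made variables would now reference a field that no longer exists after the cleanup:

```
// Defined in the shell app before the binary load.
// If [Revenue] were one of the fields removed from the model,
// any object evaluating $(vTotalSales) would just show a
// calculation error, which is what I expected, not a slowdown.
SET vTotalSales = Sum([Revenue]);
```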
Any thoughts, or does this make sense? Let me know if you have any additional questions, and thanks for any help.