I'll admit I may be confused by a couple of statements in your question, but to clarify: you can only have one BINARY load in a script. Two are not allowed.
In your final dashboard, why not just load the QVD(s) you created? There's no limit on how many QVDs you can load.
As Rob pointed out, you can only do one binary load in your script, and it has to be the first statement.
The binary load basically takes everything (the data model) from one existing .qvw file and imports it into another .qvw file. So now you're 'stuck', as you cannot run a second binary load from a second 'source' .qvw file into your target .qvw.
So, after the first binary load, you could then continue loading more data from additional .qvd data files and merge them into the data model you've got in your target .qvw file. (This lets you skip at least one .qvd file load.) It's sort of clumsy, though, since it ties you to one specific source .qvw file plus additional loads from QVD files.
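As a minimal sketch of that pattern (file and table names here are examples, not from your setup), the script would start with the one allowed binary load and then concatenate QVD data onto the inherited table:

```
// Must be the very first statement in the script
BINARY [Source1.qvw];

// Append extra rows to a table inherited from the binary load
// ('MyTable' and the .qvd name are hypothetical)
Concatenate (MyTable)
LOAD * FROM [ExtraData.qvd] (qvd);
```

If the field lists match exactly, QlikView would auto-concatenate even without the explicit Concatenate prefix, but stating it makes the intent clear.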
You might try this:
Create extract .qvw files with scripts that pull data from your different Excel files or other data sources, and then write those sources out to .qvd data files. This is similar to what you've done with your two different .qvw files right now, except you can use one single extract script, with different tabs pulling from different data sources, each writing the extracted data into the binary .qvd format, which can then be reused in multiple places.
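An extract tab might look something like this (the Excel file names, sheet name, and table names are placeholders for your own sources):

```
// Extract tab 1: pull from one Excel source and store it as a QVD
Sales1:
LOAD * FROM [SalesRegion1.xlsx] (ooxml, embedded labels, table is Sheet1);
STORE Sales1 INTO [SalesRegion1.qvd] (qvd);
DROP TABLE Sales1;

// Extract tab 2: a second source, stored to its own QVD
Sales2:
LOAD * FROM [SalesRegion2.xlsx] (ooxml, embedded labels, table is Sheet1);
STORE Sales2 INTO [SalesRegion2.qvd] (qvd);
DROP TABLE Sales2;
```

Dropping each table after the STORE keeps the extract .qvw itself lightweight; its only job is producing the QVD files.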
Next, build a "data model" .qvw file, with a script that loads data from one to many .qvd data files and merges the data into your common table. You may do additional manipulation here (or in the initial extract scripts) to make sure that the data fields all match.
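A sketch of the data model script, assuming the two example QVDs from above and a common table named Sales:

```
// Data model script: merge multiple QVDs into one common table
Sales:
LOAD *, 'Region1' AS Source FROM [SalesRegion1.qvd] (qvd);

Concatenate (Sales)
LOAD *, 'Region2' AS Source FROM [SalesRegion2.qvd] (qvd);
```

Note that adding a computed field like Source here means the load is no longer QVD-optimized; if load speed matters, do that tagging in the extract scripts instead and keep these loads as plain LOAD *.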
Once your data model .qvw is done with all the data loading, merging, and transformations, you can then do a binary load into your final dashboard .qvw file, where you have all of your presentation set up (i.e., the user .qvw file).
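The dashboard script then shrinks to almost nothing (file name is again just an example):

```
// First (and only) binary load in the dashboard .qvw --
// it pulls in the entire data model built in the previous tier
BINARY [DataModel.qvw];
```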
This tiered approach lets you merge multiple data sources into a common data set, do any transformations or manipulations on the input data, and create a coherent data set on the 'back end', without impacting anyone using the front end user dashboard. Once all the heavy work is done on the backend, you can then do a quick binary load to update your dashboard.
Hope that this helps.